LocalJob
- class braket.jobs.local.local_job.LocalQuantumJob(arn, run_log=None)[source]
- Bases: QuantumJob

Amazon Braket implementation of a hybrid job that runs locally.
Initializes a LocalQuantumJob.

- Parameters:
arn (str) – The ARN of the hybrid job.
run_log (str | None) – The container output log of running the hybrid job with the given arn.
- Raises:
ValueError – Local job is not found.
- classmethod create(device, source_module, entry_point=None, image_uri=None, job_name=None, code_location=None, role_arn=None, hyperparameters=None, input_data=None, output_data_config=None, checkpoint_config=None, aws_session=None, local_container_update=True)[source]
Creates and runs a hybrid job by setting up and running the customer script in a local Docker container.
- Parameters:
device (str) – Device ARN of the QPU device that receives priority quantum task queueing once the hybrid job begins running. Each QPU has a separate hybrid jobs queue so that only one hybrid job is running at a time. The device string is accessible in the hybrid job instance as the environment variable “AMZN_BRAKET_DEVICE_ARN”. When using embedded simulators, you may provide the device argument as a string of the form: “local:<provider>/<simulator_name>”.
source_module (str) – Path (absolute, relative, or an S3 URI) to a Python module to be tarred and uploaded. If source_module is an S3 URI, it must point to a tar.gz file. Otherwise, source_module may be a file or directory.
entry_point (str | None) – A str that specifies the entry point of the hybrid job, relative to the source module. The entry point must be in the format importable.module or importable.module:callable. For example, source_module.submodule:start_here indicates the start_here function contained in source_module.submodule. If source_module is an S3 URI, entry_point must be given. Default: source_module's name.
image_uri (str | None) – A str that specifies the ECR image to use for executing the hybrid job. The image_uris.retrieve_image() function may be used to retrieve the ECR image URIs for the containers supported by Braket. Default: <Braket base image_uri>.
job_name (str | None) – A str that specifies the name with which the hybrid job is created. Allowed pattern for hybrid job name: ^(?!-)[A-Za-z0-9-]{1,50}(?<!-)$. Default: f'{image_uri_type}-{timestamp}'.
code_location (str | None) – The S3 prefix URI where custom code will be uploaded. Default: f's3://{default_bucket_name}/jobs/{job_name}/script'.
role_arn (str | None) – This field is currently not used for local hybrid jobs. Local hybrid jobs will use the current role’s credentials. This may be subject to change.
hyperparameters (dict[str, Any] | None) – Hyperparameters accessible to the hybrid job. The hyperparameters are made accessible as a Dict[str, str] to the hybrid job. For convenience, this accepts other types for keys and values, but str() is called to convert them before being passed on. Default: None.
input_data (str | dict | S3DataSourceConfig | None) – Information about the training data. The dictionary maps channel names to local paths or S3 URIs. Contents found at any local paths will be uploaded to S3 at f's3://{default_bucket_name}/jobs/{job_name}/data/{channel_name}'. If a local path, S3 URI, or S3DataSourceConfig is provided, it will be given a default channel name "input". Default: {}.
output_data_config (OutputDataConfig | None) – Specifies the location for the output of the hybrid job. Default: OutputDataConfig(s3Path=f’s3://{default_bucket_name}/jobs/{job_name}/data’, kmsKeyId=None).
checkpoint_config (CheckpointConfig | None) – Configuration that specifies the location where checkpoint data is stored. Default: CheckpointConfig(localPath=’/opt/jobs/checkpoints’, s3Uri=f’s3://{default_bucket_name}/jobs/{job_name}/checkpoints’).
aws_session (AwsSession | None) – AwsSession for connecting to AWS Services. Default: AwsSession()
local_container_update (bool) – Perform an update, if available, from ECR to the local container image. Optional. Default: True.
- Raises:
ValueError – Local directory with the job name already exists.
- Return type:
LocalQuantumJob
- Returns:
LocalQuantumJob – The representation of a local Braket Hybrid Job.
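The job_name pattern documented above can be checked locally before calling create(). A minimal sketch using only the Python standard library; JOB_NAME_PATTERN and is_valid_job_name are hypothetical helpers for illustration, not part of the Braket SDK:

```python
import re

# The documented job-name pattern: 1-50 characters of letters, digits,
# and hyphens, neither starting nor ending with a hyphen.
JOB_NAME_PATTERN = re.compile(r"^(?!-)[A-Za-z0-9-]{1,50}(?<!-)$")

def is_valid_job_name(name: str) -> bool:
    """Return True if name matches the documented job-name pattern."""
    return JOB_NAME_PATTERN.match(name) is not None

print(is_valid_job_name("my-job-01"))  # True
print(is_valid_job_name("-leading"))   # False: starts with a hyphen
print(is_valid_job_name("a" * 51))     # False: longer than 50 characters
```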
- property arn: str
The ARN (Amazon Resource Name) of the hybrid job.
- Type:
str
- property name: str
The name of the hybrid job.
- Type:
str
- property run_log: str
Gets the run output log from running the hybrid job.
- Raises:
ValueError – The log file is not found.
- Returns:
str – The container output log from running the hybrid job.
- state(use_cached_value=False)[source]
The state of the hybrid job.
- Parameters:
use_cached_value (bool) – If True, uses the value most recently retrieved from the Amazon Braket GetJob operation. If False, calls the GetJob operation to retrieve metadata, which also updates the cached value. Default: False.
- Return type:
str
- Returns:
str – Returns "COMPLETED".
- metadata(use_cached_value=False)[source]
When running the hybrid job in local mode, the metadata is not available.
- Parameters:
use_cached_value (bool) – If True, uses the value most recently retrieved from the Amazon Braket GetJob operation, if it exists; if it does not exist, GetJob is called to retrieve the metadata. If False, always calls GetJob, which also updates the cached value. Default: False.
- Return type:
dict[str, Any]
- Returns:
dict[str, Any] – None
- cancel()[source]
When running the hybrid job in local mode, cancelling a running job is not possible.
- Return type:
str
- Returns:
str – None
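The local-mode behaviour documented for state(), metadata(), and cancel() can be summarized with a small stand-in object. FakeLocalJob is illustrative only and is not the real LocalQuantumJob class:

```python
# A stand-in mirroring the documented local-mode contract: state()
# reports "COMPLETED", metadata() is unavailable (None), and cancel()
# is a no-op returning None.
class FakeLocalJob:
    def state(self, use_cached_value=False):
        # Local jobs run to completion inside the container.
        return "COMPLETED"

    def metadata(self, use_cached_value=False):
        # Metadata is not available in local mode.
        return None

    def cancel(self):
        # A running local job cannot be cancelled.
        return None

job = FakeLocalJob()
print(job.state())  # COMPLETED
```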
- download_result(extract_to=None, poll_timeout_seconds=864000, poll_interval_seconds=5)[source]
When running the hybrid job in local mode, results are automatically stored locally.
- Parameters:
extract_to (str | None) – The directory to which the results are extracted. The results are extracted to a folder titled with the hybrid job name within this directory. Default: current working directory.
poll_timeout_seconds (float) – The polling timeout, in seconds, for result(). Default: 10 days.
poll_interval_seconds (float) – The polling interval, in seconds, for result(). Default: 5 seconds.
- Return type:
None
- result(poll_timeout_seconds=864000, poll_interval_seconds=5)[source]
Retrieves the LocalQuantumJob result persisted using the save_job_result function.
- Parameters:
poll_timeout_seconds (float) – The polling timeout, in seconds, for result(). Default: 10 days.
poll_interval_seconds (float) – The polling interval, in seconds, for result(). Default: 5 seconds.
- Raises:
ValueError – The local job directory does not exist.
- Return type:
dict[str, Any]
- Returns:
dict[str, Any] – Dict specifying the hybrid job results.
- metrics(metric_type=MetricType.TIMESTAMP, statistic=MetricStatistic.MAX)[source]
Gets all the metrics data, where the keys are the column names, and the values are a list containing the values in each row.
- Parameters:
metric_type (MetricType) – The type of metrics to get. Default: MetricType.TIMESTAMP.
statistic (MetricStatistic) – The statistic to determine which metric value to use when there is a conflict. Default: MetricStatistic.MAX.
Example

| timestamp | energy |
|-----------|--------|
| 0         | 0.1    |
| 1         | 0.2    |

would be represented as: {"timestamp": [0, 1], "energy": [0.1, 0.2]}. Values may be integers, floats, strings, or None.
- Return type:
dict[str, list[Any]]
- Returns:
dict[str, list[Any]] – The metrics data.
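Since metrics() returns a column-oriented dict, iterating per row takes a small transpose. A sketch using illustrative sample data of the documented shape, not output from a real job:

```python
# Column-oriented metrics dict of the shape returned by metrics();
# the sample values here are illustrative, not from a real job.
metrics_data = {"timestamp": [0, 1], "energy": [0.1, 0.2]}

# Transpose the columns into one record per row for easy iteration.
rows = [dict(zip(metrics_data, values)) for values in zip(*metrics_data.values())]
print(rows)  # [{'timestamp': 0, 'energy': 0.1}, {'timestamp': 1, 'energy': 0.2}]
```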
- logs(wait=False, poll_interval_seconds=5)[source]
Displays container logs for a given hybrid job.
- Parameters:
wait (bool) – True to keep looking for new log entries until the hybrid job completes; otherwise False. Default: False.
poll_interval_seconds (int) – The interval of time, in seconds, between polling for new log entries and hybrid job completion. Default: 5.
- Return type:
None