[AZURE] Azure CLI Command

Vinsen Muliadi
3 min read · Jan 17, 2023


Update Feb 15, 2023: Fixed the command in the “Upload file to Microsoft Azure’s Container” section. The command should use a double-dash (--) instead of an em-dash (—). I also added a sample command for uploading multiple files from the file system.

As someone new to the Microsoft Azure cloud ecosystem, I’ve found that the Azure CLI is quite different from GCP’s gcloud and AWS’s awscli. This Medium article is my scratch notes on the commands I’ve executed (mainly around Azure Blob Storage) and their terminology. I’ll keep it as plain as possible and update it from time to time.

Terminology

Container: A container is the equivalent of a bucket in GCS and S3. So if you need to create a bucket in Azure Blob Storage, that means you need to create a container.

Azure CLI Commands

  • Login to your Microsoft Azure account
    az login. This command prints a link to stdout for authenticating the login session. Follow the link, enter the 9-character device code into the form, and you’re good to go.

    Note: I believe there is a more proper and secure way to connect to your account, such as a service principal (roughly the Azure counterpart of a GCP service account or an AWS IAM role); a non-interactive login sketch follows this list. In my case, though, I like this method because it works like GCP’s gcloud auth login command.
  • List Microsoft Azure’s Containers
    az storage container list --account-name <YOUR_ACCOUNT>. Prints JSON-formatted data containing the list of containers in your account.

    Note: This command is similar to aws s3 ls or gsutil ls: it basically lists all the buckets in your project. The difference is that awscli and gsutil print plain text by default, while az prints JSON (a --query variant that prints plain text is sketched after this list).
  • Upload file to Microsoft Azure’s Container
    az storage azcopy blob upload --container <CONTAINER_NAME> --source <SOURCE_FILE_IN_LOCAL_ENV> --account-name <YOUR_ACCOUNT>. Uploads <SOURCE_FILE_IN_LOCAL_ENV> to <CONTAINER_NAME> under <YOUR_ACCOUNT>.

    Note: This command is similar to aws s3 cp <SOURCE_FILE_IN_LOCAL_ENV> s3://<AWS_S3_BUCKET_NAME> or gsutil cp <SOURCE_FILE_IN_LOCAL_ENV> gs://<GCP_GCS_BUCKET_NAME>.

    A sample for uploading multiple files based on their extension. In this example, we upload all ZIP files in the current directory to the target container (a one-command upload-batch variant is sketched after this list). Globbing with *.zip handles the filenames safely, so there is no need to parse ls output:
    for x in *.zip; do az storage azcopy blob upload --container <CONTAINER_NAME> --source "${x}" --account-name <YOUR_ACCOUNT>; done
  • List files inside Microsoft Azure’s Container
    az storage blob list --container-name <CONTAINER_NAME> --account-name <YOUR_ACCOUNT>. Prints a list of the blobs inside your container in JSON format. Usually you’ll need to parse the JSON with jq. Here’s the sample JSON data that this command prints to stdout; look for the right key to filter for your needs.

    For example: az storage blob list --container-name <CONTAINER_NAME> --account-name <YOUR_ACCOUNT> | jq -r '.[].name' will print only the file names. A further filtering sketch follows the sample output below.
[
  {
    "content": null,
    "deleted": false,
    "metadata": {},
    "name": "my-random-data.json",
    "properties": {
      "Content-CRC64": null,
      "appendBlobCommittedBlockCount": null,
      "blobTier": "Hot",
      "blobTierChangeTime": null,
      "blobTierInferred": true,
      "blobType": "BlockBlob",
      "contentLength": 102400,
      "contentRange": null,
      "contentSettings": {
        "cacheControl": null,
        "contentDisposition": null,
        "contentEncoding": null,
        "contentLanguage": null,
        "contentMd5": null,
        "contentType": "application/json"
      },
      "copy": {
        "completionTime": null,
        "id": null,
        "progress": null,
        "source": null,
        "status": null,
        "statusDescription": null
      },
      "creationTime": "2023-01-16T15:31:56+00:00",
      "deletedTime": null,
      "encryptionKeySha256": null,
      "etag": "0x8DAF7D6CD741D84",
      "lastModified": "2023-01-16T15:31:56+00:00",
      "lease": {
        "duration": null,
        "state": "available",
        "status": "unlocked"
      },
      "pageBlobSequenceNumber": null,
      "remainingRetentionDays": null,
      "serverEncrypted": true
    },
    "snapshot": null
  }
]
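
Since the list above only covers interactive login, here is a minimal non-interactive sketch using a service principal (the approach hinted at in the az login note). <APP_ID>, <PASSWORD>, and <TENANT_ID> are placeholders for your own service principal’s credentials:

# Sign in non-interactively as a service principal (roughly Azure's counterpart to a GCP service account)
az login --service-principal --username <APP_ID> --password <PASSWORD> --tenant <TENANT_ID>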
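
If you only need the container names rather than the full JSON, az’s built-in --query flag (JMESPath) can stand in for jq; a minimal sketch:

# Print one container name per line instead of a JSON document
az storage container list --account-name <YOUR_ACCOUNT> --query '[].name' --output tsv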
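
As a one-command alternative to the upload loop above, az also ships az storage blob upload-batch; a minimal sketch, assuming the ZIP files sit in the current directory (.):

# Upload every *.zip in the current directory to the container in one go
az storage blob upload-batch --destination <CONTAINER_NAME> --source . --pattern '*.zip' --account-name <YOUR_ACCOUNT>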
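
Building on the sample JSON above, here is a jq sketch that filters on one of the listed keys; the 1048576-byte (1 MiB) threshold is just an arbitrary example value:

# Print only the names of blobs larger than 1 MiB, keyed on properties.contentLength
az storage blob list --container-name <CONTAINER_NAME> --account-name <YOUR_ACCOUNT> | jq -r '.[] | select(.properties.contentLength > 1048576) | .name'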
