AWS Service Quotas, or AWS has a LOT of Services!
In our recent post, we unveiled unSkript Actions that can query AWS Service Quotas. Service Quotas are limits that AWS places on how much of a given feature or resource your account can use. Most of them are adjustable with a simple request, and our post showed how to determine your Service Quota values, AND request an increase, using unSkript.
In this post, I thought it might be fun to dig into AWS Service Quotas a bit deeper, and get a general idea of how Service Quotas fit into the AWS landscape.
To get the Service Quota value, you need to know the Service Name, and the Quota Code. But how do you get these values?
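Once you have both values, a single call to the Service Quotas API returns the quota. Here is a minimal sketch using boto3 — the quota code in the commented example is a placeholder, not a real code, and the live call requires AWS credentials:

```python
def fetch_quota(service_code: str, quota_code: str) -> dict:
    """Fetch one quota record from the AWS Service Quotas API.
    Requires AWS credentials in your environment."""
    import boto3  # imported lazily so the helper below works without the SDK
    client = boto3.client("service-quotas")
    return client.get_service_quota(
        ServiceCode=service_code, QuotaCode=quota_code
    )["Quota"]

def describe_quota(quota: dict) -> str:
    """One-line summary of a quota record as returned by the API."""
    kind = "adjustable" if quota["Adjustable"] else "fixed"
    return (f"{quota['ServiceCode']}/{quota['QuotaCode']}: "
            f"{quota['QuotaName']} = {quota['Value']:g} ({kind})")

# Example (placeholder quota code):
# print(describe_quota(fetch_quota("rekognition", "L-XXXXXXXX")))
```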
AWS Service Names
We can get all of the AWS Service Names using the ListServices endpoint. Running this call, we find that there are 221 named services in AWS (as of Feb 15, 2023). AWS gets a lot of flak for its naming conventions, but with so many services, of course some are going to have sub-optimal names. Lucky for us, we’ll be using the ServiceCode, and not the Service Name, so “AWS Systems Manager Incident Manager Contacts” is simply “ssm-contacts” and “AWS IAM Identity Center (successor to AWS Single Sign-On)” is just “sso.”
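If you’d rather script this than click through the console, the same list is one paginated API call away. A sketch with boto3 (assumes AWS credentials are configured):

```python
def list_service_codes() -> list:
    """Return (ServiceCode, ServiceName) pairs for every service the
    Service Quotas API knows about. Requires AWS credentials."""
    import boto3  # lazy import: only needed for the live call
    client = boto3.client("service-quotas")
    services = []
    # ListServices is paginated; the paginator walks every page for us.
    for page in client.get_paginator("list_services").paginate():
        for svc in page["Services"]:
            services.append((svc["ServiceCode"], svc["ServiceName"]))
    return services

# Example:
# codes = list_service_codes()
# print(len(codes))  # 221 as of Feb 15, 2023
```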
AWS Service Quotas
Next, we can run these 221 named services against the “List Service Quotas” endpoint to get all of the Service Quotas for all of the services. Only 113 of the 221 services (51%) have features with a service quota. Even with just half of the services having quotas, there are a LOT of preset quotas in AWS: 2,629 of them, in fact! (Feb 15, 2023)
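The per-service counts can be tallied the same way: feed each service code to the paginated ListServiceQuotas call. A sketch (credentials required, and iterating all ~221 services takes a while):

```python
from collections import Counter

def quota_counts(service_codes) -> Counter:
    """Count the quotas defined for each service code in the given list.
    Requires AWS credentials."""
    import boto3  # lazy import: only needed for the live call
    client = boto3.client("service-quotas")
    counts = Counter()
    paginator = client.get_paginator("list_service_quotas")
    for code in service_codes:
        # Each page carries a 'Quotas' list; sum the lengths per service.
        for page in paginator.paginate(ServiceCode=code):
            counts[code] += len(page["Quotas"])
    return counts

# Example:
# counts = quota_counts(["sagemaker", "ec2"])
# print(counts.most_common(2))
```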
[Table: count of Service Quotas per Service Code]
As Machine Learning can be a very expensive process, it is no surprise that Amazon SageMaker leads the pack with over 700 service quotas. 2nd in line is an oldie but a goodie, Amazon’s EC2 (debuted in 2006!) with 131 quotas.
What do we know about quotas?
The longest quota name belongs to Rekognition, and it is quite a mouthful:
“Transactions per second per account for the Amazon Rekognition Image personal protective equipment operation DetectProtectiveEquipment,” with a quota of 5. That’s a lot of words to say that the service can scan 5 frames per second to identify a helmet, face mask, or gloves on anyone in the picture. This quota can be adjusted, if desired.
The largest quota is ElasticFileSystem’s file size, weighing in at 52,673,613,135,872 bytes. (Which, if I did my math correctly, is 47.9 TiB). This is a hard limit and cannot be adjusted.
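A quick sanity check on that arithmetic — dividing by 1024 four times converts bytes to binary terabytes (TiB):

```python
# EFS maximum file size, in bytes, as reported by the Service Quotas API
EFS_MAX_FILE_SIZE = 52_673_613_135_872

# bytes -> KiB -> MiB -> GiB -> TiB
tib = EFS_MAX_FILE_SIZE / 1024**4
print(f"{tib:.1f} TiB")  # 47.9 TiB
```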
The second largest quota is the Maximum number of rows in a dataset for Amazon Forecast, with a soft limit of 3 billion rows. You can request that this number be increased.
Of the 2,629 quotas, 2,003 can be adjusted (76%), and 626 (24%) cannot be changed.
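Given the full quota list, the adjustable/fixed split falls out of the Adjustable flag on each record. A small helper, shown here against toy records shaped like the API’s output:

```python
def adjustable_split(quotas) -> tuple:
    """Return (adjustable, fixed) counts for a list of quota records,
    using the boolean 'Adjustable' field each record carries."""
    adjustable = sum(1 for q in quotas if q["Adjustable"])
    return adjustable, len(quotas) - adjustable

# Toy records standing in for real API output:
sample = [{"Adjustable": True}, {"Adjustable": True}, {"Adjustable": False}]
print(adjustable_split(sample))  # (2, 1)
```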
Only 87 of our quotas have units (3.3%). 20 are time-based, and the remaining 67 are data sizes (of varying magnitude):
These vary from 200 ms (API Gateway Maximum Integration Timeout) to 30 days: SageMaker’s Longest run time for an AutoML job from creation to termination. (In case you were wondering, 30 days is also 2,592,000 seconds.)
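That seconds figure checks out with the standard library:

```python
from datetime import timedelta

# 30 days expressed in seconds
print(int(timedelta(days=30).total_seconds()))  # 2592000
```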
When it comes to units, there’s nothing like arbitrarily multiplying by 1024 to change the units (and I see you MegaBits and your extra x8… but these are all throughput, so I’ll give that a pass).
The smallest value is Lookout Metrics Value Length at 40 Bytes, and the largest is RDS Total storage for all DB instances at 100,000 GB (or 97 TB).
The winner for the oddest size measurement goes to ElasticFileSystem’s Throughput per NFS Client at 524.288 MegaBytes.
While looking at the giant list of AWS Service Quotas, I thought it might be fun to look at the data more closely. It remains to be seen whether the unSkript team will continue to let me use PivotTables to look at data.
More importantly, the full list of Service Quotas (with the Service Code and Quota Code) is in one table, and we have published the Feb 15, 2023 list in the unSkript Docs.
If you’re interested in learning more about unSkript, join our Slack Community or check out our Open Source repo. You can run unSkript Open Source locally with Docker!
Share your thoughts