Why clouds become thunderclouds: cloud deployment failures
A cloud implementation in a large organization usually involves many services from different providers, each with its own rules for interaction, its own settings, and even its own protocols. As a result, the security configuration becomes so complex that it is difficult to track and even harder to understand. In our new study, we collected the most common cloud deployment failures, using Amazon Web Services as an example; some of them are shared in this post.
Cloud providers today offer services that go far beyond trivial dedicated servers or file storage. Every aspect of every service can be programmed, which gives developers and operators more control over security than a traditional data center does. However, this wealth of features and configuration tools means that the same service can be configured through several different interfaces, and, importantly, the default settings differ between those interfaces.
For experienced users this is not a problem; on the contrary, they will pick the most suitable tool for each task. For everyone else, however, the result may not match expectations.
Amazon S3 Storage
Amazon Simple Storage Service (Amazon S3) is one of the most popular cloud services, used by customers ranging from small companies to large corporations. That popularity has turned S3 into a favorite target for attackers looking for flawed service implementations and configuration errors.
The most common Amazon S3 attack vectors used by cybercriminals are:
- publicly writable buckets;
- account hijacking;
- privilege abuse.
In one incident, a newspaper's website was hosted on Amazon and kept all of its images, scripts, and style files in an S3 bucket. Web access to the bucket was read-only, so its defacement came as a complete surprise to the site administrators. They simply could not understand how they had been hacked until cloud service specialists explained that incorrect access permissions were the cause.
Amazon S3 buckets can be accessed over HTTP/HTTPS, and here the site administrators did everything right by restricting access to read-only. However, S3 can also be reached through the native AWS protocol via the command line, and the access rights for such calls must be configured separately: by default, bucket access via the AWS CLI was allowed for all authenticated AWS users.
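One way to close the command-line gap is to attach an explicit deny policy to the bucket. The sketch below builds such a policy as a Python dictionary; the bucket name `example-bucket` and the account ID `111122223333` are placeholders, not values from the incident described above.

```python
import json

# A minimal sketch: a bucket policy that explicitly denies object writes
# to everyone outside the owning account, closing the "writable via the
# AWS CLI" gap even if the bucket ACL is misconfigured.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyForeignWrite",
            "Effect": "Deny",
            "Principal": "*",
            "Action": ["s3:PutObject", "s3:PutObjectAcl"],
            "Resource": "arn:aws:s3:::example-bucket/*",
            "Condition": {
                # Deny unless the caller belongs to the owning account
                # (the account ID here is a placeholder).
                "StringNotEquals": {"aws:PrincipalAccount": "111122223333"}
            },
        }
    ],
}

print(json.dumps(policy, indent=2))
```

The resulting JSON can be applied with `aws s3api put-bucket-policy --bucket example-bucket --policy file://policy.json`. An explicit Deny takes precedence over any Allow, which is why this pattern is safer than relying on ACLs alone.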
Output of the console command aws s3api get-bucket-acl --bucket
For new Amazon S3 users, the need to restrict access to their buckets twice is far from obvious, and this has led to numerous incidents.
At the end of 2018, AWS tightened its security policy, prohibiting public access from the console for newly created S3 buckets, but this policy did not apply when the command line was used: access still had to be restricted with a separate command.
In 2018-2019, S3 bucket compromises became widespread. Some security professionals and friendly hackers deliberately searched for writable AWS resources and left warning files in them.
An anonymous warning about an insecure AWS S3 configuration
Someone even offered their services for setting up secure storage parameters:
Warning and service offer from Pentester Random Robbie
Random Robbie is the pseudonym of pentester Robbie Wiggins, who in 2018 left his warning in thousands of writable S3 buckets.
Hackers immediately took advantage of the ability to freely modify sites hosted in S3 buckets. The Magecart group injected malicious code at scale to steal bank card data and user account information; all that was required was to find a site that accepts payments and uses AWS. As a result, the criminals managed to steal the data of hundreds of thousands of visitors to such resources.
Example data that a skimmer passed to criminals
Magecart's victims include hundreds of online stores, among them well-known brands.
In the course of the study, we found that, despite the many publications and recommendations on securely configuring AWS services, at least five of the compromised online stores still use writable S3 buckets. At the time of writing, their sites do not contain skimmers, but skimmers could be added at any moment, since cybercriminals have convenient tools at their disposal that make finding vulnerable resources easy.
S3 Open Storage Search Tools
The Slurp, Bucket Stream, and s3scanner tools help find readable and writable buckets.
Slurp finds candidate bucket names for a given domain and checks write permissions on them:
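The name-guessing heuristic that Slurp and similar tools rely on can be sketched in a few lines of Python. The suffix list below is illustrative only, not the actual wordlist shipped with any of these tools:

```python
def candidate_buckets(domain: str) -> list[str]:
    """Generate likely S3 bucket names for a domain, the way bucket-hunting
    tools do: combine the bare name and the dashed domain with common
    suffixes (this suffix list is an illustrative sample)."""
    base = domain.split(".")[0]            # "example.com" -> "example"
    dashed = domain.replace(".", "-")      # "example.com" -> "example-com"
    suffixes = ["", "-backup", "-assets", "-static", "-media", "-dev", "-prod"]
    names = set()
    for s in suffixes:
        names.add(base + s)
        names.add(dashed + s)
    return sorted(names)

print(candidate_buckets("example.com"))
```

A real scanner would then probe each candidate name over HTTP and the S3 API; here we only generate the candidates.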
Example Slurp output. HTTP access is closed.
To check whether the discovered buckets are accessible through the AWS CLI, you can use the get-bucket-acl command:
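The same check can be automated. An S3 ACL is public when it grants permissions to the global AllUsers or AuthenticatedUsers groups; the function below flags such grants. In practice the `acl` dictionary would come from boto3's `s3.get_bucket_acl(Bucket=...)`; here we use a hand-made sample in the same shape, so the sketch runs without AWS credentials:

```python
# Group URIs that make a grant effectively public.
PUBLIC_GROUPS = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def public_grants(acl: dict) -> list[tuple[str, str]]:
    """Return (group URI, permission) pairs that expose the bucket.
    `acl` has the shape returned by boto3's get_bucket_acl()."""
    found = []
    for grant in acl.get("Grants", []):
        grantee = grant.get("Grantee", {})
        if grantee.get("Type") == "Group" and grantee.get("URI") in PUBLIC_GROUPS:
            found.append((grantee["URI"], grant["Permission"]))
    return found

# Hand-made sample illustrating a world-writable bucket:
sample_acl = {
    "Grants": [
        {"Grantee": {"Type": "CanonicalUser", "ID": "abc"},
         "Permission": "FULL_CONTROL"},
        {"Grantee": {"Type": "Group",
                     "URI": "http://acs.amazonaws.com/groups/global/AllUsers"},
         "Permission": "WRITE"},
    ]
}
print(public_grants(sample_acl))
```

A WRITE grant to AllUsers is exactly the misconfiguration that let pentesters and criminals alike drop files into the buckets described above.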
Access to the resource through the AWS API is also closed
The s3scanner utility, written in Python, uses a simple heuristic to find candidate bucket names and check access to them.
Search and check availability of S3 storage with s3scanner
The Bucket Stream utility searches publicly available sources for potentially vulnerable S3 buckets, for example in Certificate Transparency logs.
The AWSBucketDump utility lists bucket files whose names contain specific keywords:
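The core of this keyword approach is a simple case-insensitive filter over an object listing. A minimal sketch (the keyword list and file names are illustrative, not AWSBucketDump's defaults):

```python
def matching_keys(object_keys: list[str], keywords: list[str]) -> list[str]:
    """Filter S3 object keys by keywords, the way AWSBucketDump narrows a
    bucket listing down to potentially interesting files."""
    kw = [k.lower() for k in keywords]
    return [key for key in object_keys
            if any(k in key.lower() for k in kw)]

# An illustrative bucket listing:
listing = ["img/logo.png", "backup/db_dump.sql", "conf/Secrets.yaml", "index.html"]
print(matching_keys(listing, ["backup", "secret", "password"]))
# → ['backup/db_dump.sql', 'conf/Secrets.yaml']
```

In the real tool the listing comes from an S3 `list-objects` call on each discovered bucket; the filtering logic is the same.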
AWSBucketDump Utility Result
Using these utilities, from December 2018 to January 2019 we found more than 5,200 unique S3 buckets. About 4,400 of these were accessible to the standard AWS command-line tools. Only 79 of them were readable and 40 were writable; to access some of them, it was enough to simply assign oneself the necessary rights.
How Accounts Leak
Access rights to resources are a headache for developers, and with cloud resources the problem becomes even more acute. Processes must be authenticated to gain access to resources; otherwise there is a risk of data theft or compromise. The whole question is how to do this without compromising the data when publishing code to a repository, as the author of the following fragment, posted on Pastebin, did:
Snippet of Pastebin-published code with valid AWS API ID and key
Using this data, an attacker can obtain all the rights granted to this application's account.
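Leaks like this are easy to catch before the code is ever published: AWS access key IDs have a fixed 20-character format, which is exactly what tools such as git-secrets grep for in pre-commit hooks. A minimal sketch of that check (the sample uses AWS's documented example key, not a real credential):

```python
import re

# Access key IDs are 20 characters starting with "AKIA" (long-term keys)
# or "ASIA" (temporary credentials).
KEY_ID_RE = re.compile(r"\b(?:AKIA|ASIA)[0-9A-Z]{16}\b")

def find_key_ids(source: str) -> list[str]:
    """Return AWS access key IDs found in a source-code string."""
    return KEY_ID_RE.findall(source)

snippet = 'aws_access_key_id = "AKIAIOSFODNN7EXAMPLE"'
print(find_key_ids(snippet))
# → ['AKIAIOSFODNN7EXAMPLE']
```

The matching secret access key (40 base64-like characters) has no reliable prefix, so scanners usually flag the key ID and treat nearby high-entropy strings as suspect.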
Another source of credential leaks is Kubernetes API client certificates. On the one hand, in its default configuration this container orchestration system requires mandatory protection in the form of a client certificate. On the other hand, developers, with surprising naivety, publish those certificates along with their code on GitHub, Pastebin, and other services.
On Pastebin alone, we managed to find about fifty different client certificates published together with configuration scripts.
Publishing certificates in plain text anywhere is a bad idea; publishing them on GitHub is even worse, because:
- to delete a certificate, you have to remove it from every saved version in the repository's history;
- if another developer forks your code, they receive a copy of the certificate, which you cannot delete;
- you cannot prevent an attacker from using your certificate and will most likely notice nothing until real problems begin.
Compromised API keys and certificates can cause serious financial damage. One Russian company ended up owing Amazon about $12,000 for a single day: its Bitrix-based website was hacked, and the site contained, among other things, an API key with access to S3 storage.
Screenshot from the billing panel of a Russian company whose stolen API key was used to create numerous virtual machines for cryptocurrency mining. Source: habr.com/en/post/357764
A leak of customer data can be no less painful, as happened at Imperva in 2019: there, too, an API key was stolen, and all client data was leaked.
Cybercriminals can also use stolen accounts to illegally sell dedicated AWS servers, which the real owners then have to pay for. On the lolzteam forum, we found more than 250 ads offering "clean, unused dedicated servers."
Announcement on the lolzteam forum. Who will pay Amazon in the end?
A third source of API key leaks is the variety of training courses for programmers. In trying to explain the process of connecting to AWS services to beginners as simply as possible, the authors replicate bad practices, which later lead to case after case of compromise.
A snippet from a Python course explaining how to work with Amazon S3. The keys are hardcoded into the program
The authors of a course on the progressive and safe Java language demonstrate the same reckless attitude to API key security:
The language is different, but the advice is the same: the keys go right into the program text.
Incorrect configuration of cloud services creates many risks, from the illegal use of rented computing resources for cryptocurrency mining to data theft and the injection of online skimmers. We therefore recommend that security staff analyze cloud deployment scenarios to identify potential vulnerabilities before deployment is complete. Infrastructure-as-code templates such as AWS CloudFormation give insight into how the resulting infrastructure will work and where to look for incorrect or missing security settings and logging. Among the security tools developed by Trend Micro there is a product aimed at protecting cloud environments, Deep Security for Amazon EC2 instances, and the Cloud Conformity tool makes it possible to check a company's cloud environment for insecure settings.
For programmers who use AWS API keys to interact with S3 buckets, we suggest switching to AWS Secrets Manager, Docker Secrets, Blackbox, git-secrets, and similar tools, which help avoid the compromise and malicious use of credentials stored alongside application source code.
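The simplest replacement for the hardcoding shown in the course snippets is to keep credentials out of the source entirely and read them from the environment; the AWS SDKs and CLI pick these variables up automatically. A sketch (the key values set below are AWS's documented examples, used here only to simulate a configured environment):

```python
import os

def creds_from_env() -> tuple[str, str]:
    """Read AWS credentials from the environment instead of source code.
    In most programs even this is unnecessary: boto3 and the AWS CLI read
    these variables (or an IAM role) on their own."""
    key_id = os.environ.get("AWS_ACCESS_KEY_ID")
    secret = os.environ.get("AWS_SECRET_ACCESS_KEY")
    if not key_id or not secret:
        raise RuntimeError("AWS credentials are not set in the environment")
    return key_id, secret

# Simulate a configured environment (these are AWS's documented example
# values, not real credentials):
os.environ["AWS_ACCESS_KEY_ID"] = "AKIAIOSFODNN7EXAMPLE"
os.environ["AWS_SECRET_ACCESS_KEY"] = "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
print(creds_from_env()[0])
```

For application-level secrets, the same idea extends to AWS Secrets Manager: store the value there and fetch it at runtime with boto3's `get_secret_value`, so nothing sensitive ever lands in the repository.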