How to Secure Your Information on AWS: 10 Best Practices

The recent Deep Root Analytics incident that exposed the sensitive records of 198 million Americans, or almost all registered voters, was yet another reminder of the risks that come with storing data in the cloud. The most alarming part, perhaps, is that this massive leak of 1.1 terabytes of sensitive data, described by some as the "mother lode of all leaks," could have been easily avoided.

This security incident highlighted the fact that insiders pose as much of a threat to an organization as hackers, even though insiders may not be acting maliciously. Enterprises experience an average of 11 insider threats every month, whether from malicious or negligent insiders.

In Deep Root Analytics' case, it was simple negligence. The data repository was an AWS S3 bucket that had its access set to public, so anyone could find it, and download much of it, simply by navigating to an Amazon subdomain.

The misconfiguration of the S3 bucket is a common mistake. IaaS platforms like AWS are often overlooked by organizations, and the Deep Root Analytics breach emphasizes the importance of a strategy that can help avoid this kind of costly misstep.

Ensuring proper configuration will also help protect data from outside threats. The AWS platform itself has strong security thanks to extensive investments by Amazon, but even the strongest defenses can be breached by resourceful and persistent bad actors. As we saw last year in the Dyn DDoS attack, a large-scale assault can still overwhelm even the sophisticated security protocols of AWS.

Understanding the Shared Responsibility Model

Like most cloud providers, AWS uses a shared responsibility model. It means both the vendor and the customer are responsible for securing the data. The vendor, Amazon, is responsible for the security "of the cloud," i.e. its infrastructure that encompasses hosting facilities, hardware and software. Amazon's responsibility includes protecting against intrusion and detecting fraud and abuse.

The customer, in turn, is responsible for the security "in" the cloud, i.e. the organization's own content, applications using AWS, and identity and access management, as well as its internal infrastructure like firewalls and networks.

Under this model, Deep Root Analytics was the one liable for the recent data exposure, and the repercussions will likely linger for a long time.

How to Secure Your Information on the AWS Platform

These best practices can serve as a starting point.

1. Enable CloudTrail across all AWS and turn on CloudTrail log validation. Enabling CloudTrail allows logs to be generated, and the API call history provides visibility into data and resource changes. With CloudTrail log validation on, you can identify any changes to log files after delivery to the S3 bucket (sketched in the first example after this list).

2. Enable CloudTrail S3 bucket access logging. These buckets contain the log data that CloudTrail captures. Enabling access logging will allow you to track access and identify potential attempts at unauthorized access.

3. Enable flow logging for Virtual Private Cloud (VPC). These flow logs allow you to monitor network traffic that crosses the VPC, alerting you to anomalous activity like unusually high levels of data transfers (sketched in the second example after this list).

4. Provision access to groups or roles using identity and access management (IAM) policies. By attaching the IAM policies to groups or roles instead of individual users, you minimize the risk of unintentionally granting excessive permissions and privileges to a user, as well as make permission management more efficient.

5. Restrict access to the CloudTrail bucket logs and use multifactor authentication for bucket deletion. Unrestricted access, even for administrators, increases the risk of unauthorized access in case of credentials stolen through a phishing attack. If the AWS account becomes compromised, multifactor authentication will make it more difficult for hackers to cover their tracks.

6. Encrypt log files at rest. Only users who have permission to access the S3 buckets containing the logs should have decryption privileges, in addition to access to the CloudTrail logs.

7. Regularly rotate IAM access keys. Rotating the keys and setting a standard password expiration policy helps prevent access due to a lost or stolen key (sketched in the third example after this list).

8. Restrict access to commonly used ports, such as FTP, MongoDB, MSSQL, SMTP, etc., to required entities only.

9. Don't use access keys with root accounts. Doing so can easily compromise the account and grant access to all AWS services in the event of a lost or stolen key. Create role-based accounts instead, and avoid using root user accounts altogether.

10. Terminate unused access keys and disable inactive users and accounts. Both unused access keys and inactive accounts increase the threat surface and the risk of compromise.

If you're using custom applications in AWS, you also need to follow best practices for custom application security. Don't leave any loopholes for bad actors to exploit or for your IT team to overlook. Mistakes like those made by Deep Root Analytics can be prevented, and no organization can afford the implications of not paying attention to their policies and practices.
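To make a few of the items above concrete, here is a minimal sketch, using Python and boto3, of practices 1, 2, and 6: a multi-region CloudTrail trail with log file validation, access logging on the bucket that receives the trail's logs, and default encryption on that bucket. The trail and bucket names are placeholders, and the snippet assumes the logging buckets already exist with a bucket policy that allows CloudTrail to write to them.

```python
# Sketch of practices 1, 2, and 6. Trail and bucket names are placeholders;
# the buckets are assumed to exist with a policy that lets CloudTrail write.
import boto3

cloudtrail = boto3.client("cloudtrail")
s3 = boto3.client("s3")

TRAIL_NAME = "org-wide-trail"
TRAIL_BUCKET = "my-cloudtrail-logs"
ACCESS_LOG_BUCKET = "my-s3-access-logs"

# Practice 1: capture API activity in every region and detect tampering with
# delivered log files via digest-based log file validation.
cloudtrail.create_trail(
    Name=TRAIL_NAME,
    S3BucketName=TRAIL_BUCKET,
    IsMultiRegionTrail=True,
    EnableLogFileValidation=True,
)
cloudtrail.start_logging(Name=TRAIL_NAME)

# Practice 2: record every request made against the CloudTrail bucket itself,
# so unauthorized access attempts leave a trace in the access-log bucket.
s3.put_bucket_logging(
    Bucket=TRAIL_BUCKET,
    BucketLoggingStatus={
        "LoggingEnabled": {
            "TargetBucket": ACCESS_LOG_BUCKET,
            "TargetPrefix": "cloudtrail-bucket-access/",
        }
    },
)

# Practice 6: encrypt log files at rest with a default encryption rule.
s3.put_bucket_encryption(
    Bucket=TRAIL_BUCKET,
    ServerSideEncryptionConfiguration={
        "Rules": [
            {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "aws:kms"}}
        ]
    },
)
```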
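The flow-logging and port-restriction items (3 and 8) can likewise be scripted. The sketch below creates a flow log for a VPC that delivers to CloudWatch Logs and opens a commonly targeted port only to a specific network range. The VPC ID, security group ID, IAM role ARN, and CIDR block are all placeholder values.

```python
# Sketch of practices 3 and 8. All resource identifiers are placeholders.
import boto3

ec2 = boto3.client("ec2")

VPC_ID = "vpc-0123456789abcdef0"            # placeholder VPC
SECURITY_GROUP_ID = "sg-0123456789abcdef0"  # placeholder security group
FLOW_LOG_ROLE_ARN = "arn:aws:iam::123456789012:role/flow-logs-role"  # placeholder
TRUSTED_CIDR = "203.0.113.0/24"             # placeholder trusted network

# Practice 3: capture accepted and rejected traffic crossing the VPC so that
# anomalies such as unusually large transfers show up in CloudWatch Logs.
ec2.create_flow_logs(
    ResourceIds=[VPC_ID],
    ResourceType="VPC",
    TrafficType="ALL",
    LogGroupName="vpc-flow-logs",
    DeliverLogsPermissionArn=FLOW_LOG_ROLE_ARN,
)

# Practice 8: expose a commonly used port (here MongoDB on 27017) only to a
# known network instead of 0.0.0.0/0.
ec2.authorize_security_group_ingress(
    GroupId=SECURITY_GROUP_ID,
    IpPermissions=[
        {
            "IpProtocol": "tcp",
            "FromPort": 27017,
            "ToPort": 27017,
            "IpRanges": [
                {"CidrIp": TRUSTED_CIDR, "Description": "trusted network only"}
            ],
        }
    ],
)
```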
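Finally, the IAM-related items (4, 7, 9, and 10) lend themselves to a periodic audit script. The sketch below attaches a managed policy to a group rather than to individual users, checks whether the root account has any access keys, and deactivates keys that have gone unused for 90 days as the first step of a rotate-or-terminate workflow. The group name is a placeholder, and a production script would also handle pagination, error handling, and the actual key rotation.

```python
# Sketch of practices 4, 7, 9, and 10. Group name is a placeholder;
# pagination and error handling are omitted for brevity.
from datetime import datetime, timedelta, timezone

import boto3

iam = boto3.client("iam")
STALE_AFTER = timedelta(days=90)
now = datetime.now(timezone.utc)

# Practice 4: grant permissions through a group policy, not per-user policies.
iam.attach_group_policy(
    GroupName="auditors",  # placeholder group
    PolicyArn="arn:aws:iam::aws:policy/SecurityAudit",
)

# Practice 9: the root account should have no access keys at all.
summary = iam.get_account_summary()["SummaryMap"]
if summary.get("AccountAccessKeysPresent", 0):
    print("WARNING: root account access keys exist and should be removed.")

# Practices 7 and 10: find keys that have not been used recently and disable
# them, then rotate or delete them as a follow-up step.
for user in iam.list_users()["Users"]:
    for key in iam.list_access_keys(UserName=user["UserName"])["AccessKeyMetadata"]:
        last_used = iam.get_access_key_last_used(AccessKeyId=key["AccessKeyId"])
        used_at = last_used["AccessKeyLastUsed"].get("LastUsedDate")
        if used_at is None or now - used_at > STALE_AFTER:
            iam.update_access_key(
                UserName=user["UserName"],
                AccessKeyId=key["AccessKeyId"],
                Status="Inactive",
            )
            print(f"Deactivated stale key {key['AccessKeyId']} for {user['UserName']}")
```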

Sekhar Sarukkai

About the Author: Sekhar Sarukkai is a Co-Founder and the Chief Scientist at Skyhigh Networks, driving future innovations and technologies in cloud security. He brings more than 20 years of experience in enterprise networking, security, and cloud service development.

Editor's Note: The opinions expressed in this guest author article are solely those of the contributor, and do not necessarily reflect those of Tripwire, Inc.
