AWS re:Invent is a big show. Officially, Amazon Web Services puts attendance at some 50,000, plus a claimed 300,000 online registrants. The event is staged this week at The Venetian resort in Las Vegas from Nov. 28 to Dec. 2, and understanding how it changes cloud computing is an exercise in tiered comprehension.
Those tiers generally break down to big picture news that AWS says will shift its platform forward and enable growth; middle-tier, use case-dependent, industry-specific developments; and because this is cloud engineering, a set of lower-level substrate technology updates designed to delight technology practitioners.
Given the scope of the challenge ahead, let’s don our attendee lanyard, pack our hand sanitizer and take a tour of some of what AWS has brought to the party this year.
The source of cloud: Water
Just to start us off with a curveball, AWS went to the source of the cloud itself with its “day zero” pre-show news. The company focused on water. AWS has committed to becoming water positive (water+) by 2030, which means it will return more water to communities than it uses in its direct daily operations.
“In just a few years half of the world’s population is projected to live in water-stressed areas, so to ensure all people have access to water, we all need to innovate new ways to help conserve and reuse this precious resource,” said Adam Selipsky, CEO of AWS.
SEE: Top 5 things to know about corporate environmental goals (TechRepublic)
It’s what he calls “water stewardship in cloud operations,” which is an interesting use of the word stewardship; in technology circles, it’s a term more normally used to describe the body or organization that oversees the life of an open-source community, platform, product or developer language. But in a sense, it’s the correct term because nobody should actually “own” water.
Staying even bigger than the big picture, AWS invited Ukraine’s minister of digital transformation Mykhailo Fedorov on stage in Las Vegas this year. Expressing a warning not to simply apply cloud computing to public entities and, as he put it, thereby “digitize bureaucracy,” Fedorov called for world governments to instead focus on fixing people’s problems.
In an account that reflected the reality of life we have come to expect from our Ukrainian friends, Fedorov told the story of an early Russian attack that destroyed the building where the Ukrainian government stored all of its backup data. This building held the backup records of the country’s entire citizenry and institutions or, to put it in stark terms, the data that literally described the entire Ukrainian nation.
A new AWS Supply Chain lane
Getting straight into the product zone and staying with the big picture news, a central product update this year is the announcement of AWS Supply Chain. This is a new application designed to improve supply chain visibility so businesses can make faster, more informed decisions and mitigate risk. Using a clearly presented web interface, a business can see all of its offices, warehouses and other locations and view their stock inventory levels.
The application also automatically combines and analyzes data across multiple supply chain systems, so businesses can observe their operations in real time, find trends more quickly and generate more accurate demand forecasts. There’s a chat interface to enable central HQ to talk to inventory managers and a weighty dose of machine learning to help pinpoint stock overloads and shortfalls.
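AWS has not published the internals of these machine learning models, but the general idea of forecasting demand from historical data and flagging overloads and shortfalls can be sketched in a few lines. Everything below, including the thresholds and function names, is invented for illustration.

```python
# Illustrative only: AWS has not disclosed how AWS Supply Chain's ML works.
# This toy sketch shows the general technique of forecasting demand and
# flagging stock shortfalls or overloads from per-location inventory data.

def forecast_demand(history, alpha=0.5):
    """Simple exponential smoothing over past per-period demand figures."""
    level = history[0]
    for x in history[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

def flag_inventory(stock, history, horizon=4, buffer=2.4):
    """Classify a location's stock level against forecast demand."""
    expected = forecast_demand(history) * horizon
    if stock < expected:
        return "SHORTFALL"
    if stock > expected * buffer:
        return "OVERLOAD"
    return "OK"

print(flag_inventory(stock=50, history=[20, 22, 25, 24]))   # SHORTFALL
print(flag_inventory(stock=500, history=[20, 22, 25, 24]))  # OVERLOAD
```

A real system would, as the article notes, first have to merge data from many ERP and warehouse systems before any such signal could be computed, which is precisely the integration work AWS claims to absorb.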
SEE: The power of intelligence in solving supply chain challenges (TechRepublic)
Arguably AWS’s biggest initial news this year, and sensitively timed to coincide with the supply chain disruptions being felt around the planet as we head into 2023, AWS Supply Chain is said to overcome the major problems associated with integrating supply chain data with enterprise resource planning systems.
“Customers tell us that the undifferentiated heavy lifting required in connecting data between different supply chain solutions has inhibited their ability to quickly see and respond to potential supply chain disruptions,” said Diego Pantoja-Navajas, vice president of AWS Supply Chain.
Pantoja-Navajas explains that in order to gain visibility into their supply chain network, businesses must build custom integrations that can access and process data across an array of ERP and supply chain management systems.
He said these projects introduce expensive third-party engagements and long-term development cycles. Worse still, they often fail to detect patterns that highlight supply chain problems. Through its use of ML, AWS has built those integrations, meaning a customer doesn’t have to manually click and link their way through a minefield of database fields.
Anyone for Nitro chips?
AWS is essentially a software company, apart from when it’s making hardware. This year saw three new Amazon Elastic Compute Cloud (Amazon EC2) instances powered by three new AWS-designed chips. Alongside its Graviton, Trainium and Inferentia chip lines, AWS also makes software-defined hardware devices known as Nitro Cards.
Now with a decade of experience designing chips developed for performance and scalability in the cloud, AWS has introduced specialized chip designs built for specific cloud workloads. The devices sport varying characteristics that suit use cases which require faster processing, higher memory capacity, faster storage input/output and increased networking bandwidth.
Borrowing a term from the cleanrooms in which chips are fabricated, AWS has now launched AWS Clean Rooms, an analytics service for organizations that need to work on mutual data but with a degree of separation and privacy.
AWS Clean Rooms enables two or more companies to securely analyze and collaborate on their combined datasets but without sharing or revealing underlying data. Users get a set of built-in data access controls that protect sensitive data, including query controls, query output restrictions, query logging and cryptographic computing tools.
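The "query output restriction" idea mentioned above can be illustrated with a toy example: collaborators may only see aggregates over the combined data, and aggregates computed from too few rows are suppressed so individual records cannot be inferred. This is a sketch of the technique in general, not AWS's implementation; all names are invented.

```python
# Illustrative only: a toy version of the query-output-restriction concept
# behind AWS Clean Rooms. Real analysis rules are far richer; this simply
# shows aggregate-only access with small-group suppression.

MIN_GROUP_SIZE = 3  # suppress any aggregate computed from too few rows

def restricted_count_by(rows, key):
    """Return counts per key value, hiding groups small enough to
    identify individual records."""
    counts = {}
    for row in rows:
        counts[row[key]] = counts.get(row[key], 0) + 1
    return {k: v for k, v in counts.items() if v >= MIN_GROUP_SIZE}

# Combined dataset from two partners; neither side ever sees raw rows,
# only the restricted aggregate below.
combined = [
    {"region": "west"}, {"region": "west"}, {"region": "west"},
    {"region": "east"},  # only one row: suppressed in the output
]
print(restricted_count_by(combined, "region"))  # {'west': 3}
```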
SEE: The CHIPS Act has been signed into law: Now what? (TechRepublic)
According to Dilip Kumar, vice president of AWS Applications, companies across multiple industries increasingly look to complement their data with external business partners’ data to build a complete view of their business. In the advertising industry, for example, brands and media publishers often want to collaborate using datasets stored across different channels and applications to improve the relevance of their campaigns.
If the above AWS Clean Rooms example is middle-tier and industry-specific — AWS says this technology is also heavily applicable to the life sciences and financial services industries — then we can now reasonably move down one rung lower to the more granular end of the spectrum.
Into the DataZone
Amazon DataZone is a new data management service designed to make it easier to find data stored across AWS. In this hybrid-cloud, API-connected world, data can be found not just across AWS itself but also throughout an organization’s own on-premises data estate, in third-party sources and even in “public” datasets, meaning information certified as open data or data residing in a program such as Google Cloud’s Public Datasets.
With Amazon DataZone, administrators and data stewards who oversee an organization’s data assets can manage and govern access to data using fine-grained controls to ensure it is accessed with the right level of privileges and in the right context.
Before organizations can unlock the full value of that data, however, the producers who generate and manage it need to make it accessible without giving up control. AWS says it has that governance aspect covered.
“Good governance is the foundation that makes data accessible to the entire organization, but we often hear from customers that it is difficult to strike the right balance between making data discoverable and maintaining control,” said Swami Sivasubramanian, vice president of databases, analytics and machine learning at AWS. “With Amazon DataZone, customers can use a single service that balances strong governance controls with streamlined access to make it easy to find, organize and collaborate with data.”
SEE: Data Governance vs Data Management: Main Differences (TechRepublic)
SimSpace Weaver for simulation access
With a new roster this year as big as ever, AWS also highlighted AWS SimSpace Weaver, a managed compute service that helps customers build, operate and run large-scale spatial simulations. This service enables users to run a simulation that has more than a million “entities” interacting in real time.
With AWS SimSpace Weaver, customers can deploy spatial simulations to model dynamic systems with many data points, such as traffic patterns across an entire city, crowd flows in a venue or factory-floor layouts, and then use the simulations to visualize physical spaces, perform immersive training and get insights on different scenarios to make informed decisions.
“Previously, if a customer wanted to scale up their spatial simulation, they had to balance the accuracy of the simulation with the capacity of their hardware, which limited the usefulness of what they could learn,” said Bill Vass, vice president of engineering at AWS. “AWS SimSpace Weaver removes the burden of managing simulation infrastructure, simplifying how customers run large-scale simulations and freeing them to focus on creating differentiated content and expanding access to simulation development.”
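Scaling a simulation past a single machine's capacity is classically done by partitioning space so each worker only updates the entities inside its own region. AWS has not published SimSpace Weaver's internals, so the grid scheme and names below are an invented sketch of the general technique, not the service's implementation.

```python
# Illustrative only: spatial partitioning, the general technique behind
# large-scale simulation services such as AWS SimSpace Weaver. Each grid
# cell stands in for a worker that owns one region of the world.

import random

WORLD, GRID = 1000.0, 10          # world size and partitions per axis
CELL = WORLD / GRID

def partition_of(x, y):
    """Map a position to the grid cell (i.e., worker) that owns it."""
    return (min(int(x / CELL), GRID - 1), min(int(y / CELL), GRID - 1))

# Scatter entities across the world and bucket them by owning partition.
random.seed(0)
entities = [(random.uniform(0, WORLD), random.uniform(0, WORLD))
            for _ in range(100_000)]
buckets = {}
for x, y in entities:
    buckets.setdefault(partition_of(x, y), []).append((x, y))

# Each partition can now update only its own entities, in parallel,
# which is what lets entity counts grow past one machine's capacity.
print(len(buckets), "partitions,", sum(map(len, buckets.values())), "entities")
```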
QuickSight to supplement BI
The company also announced five new capabilities to help customers streamline business intelligence using Amazon QuickSight, its serverless BI cloud service. This year sees an expansion of QuickSight Q, a natural language querying capability, to support forecast and “why” questions and automate data preparation.
Users can now create and share paginated reports alongside interactive dashboards, quickly analyze and visualize billion-row datasets directly in QuickSight, and programmatically create and manage BI assets to accelerate migration from legacy systems.
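QuickSight Q's forecast questions are a managed ML feature whose model AWS doesn't detail; as a minimal sketch of the kind of question it answers ("forecast next quarter's sales"), here is a least-squares trend line extrapolated one step ahead. The function name and data are invented for illustration.

```python
# Illustrative only: a toy stand-in for the forecasting that QuickSight Q
# performs from natural language questions. Fits y = a + b*t by least
# squares over historical values and extrapolates one period ahead.

def forecast_next(values):
    """Fit a straight trend line to the series and predict the next value."""
    n = len(values)
    t_mean = (n - 1) / 2
    y_mean = sum(values) / n
    b = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(values)) \
        / sum((t - t_mean) ** 2 for t in range(n))
    a = y_mean - b * t_mean
    return a + b * n  # value at the next time step

print(forecast_next([100, 110, 120, 130]))  # trend continues: 140.0
```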
“Organizations today are composed of numerous stakeholders with varying levels of technical expertise — all of whom need access to critical business insights to make better decisions and share knowledge,” said Ganapathy Krishnamoorthy, vice president of analytics at AWS.
SEE: A guide to the best data intelligence software (TechRepublic)
Krishnamoorthy said the team has made QuickSight more intuitive, flexible and accessible while streamlining BI operations. The service might be used for a variety of scenarios, such as when users want to forecast sales using natural language queries, distribute critical operational reports, embed analytics in high-traffic websites or visually analyze massive datasets.
Amazon brings a cloud storm
Without directly analyzing market share or attempting a separate breakdown of where Google Cloud Platform and Microsoft Azure sit in relation to AWS, a few trends and outliers do stand out.
AWS continues to be big, really big. That scale shows not just in customers (the other cloud hyperscalers can surely hold their own on that front) but in platform breadth and product mechanics.
If that granular, cloud engineer-level intricacy has led AWS to be regarded as the more complicated choice to deploy in the past, the company is clearly working hard to provide the accelerators, automations and shortcuts that make it less so. Certainly, AWS holds media-welcomed events to detail its cloud stance more frequently (some would say prolifically) than its counterparts.
The cloud storm continues to form; be sure to wear a hat.
Don’t miss the other re:Invent 2022 report from Vegas as AWS also revealed some notable partnerships.