Hava Blog and Latest News

In Cloud Computing This Week [July 17th 2020]

Written by Team Hava | July 17, 2020

This week's roundup of all the cloud news.

 

Hi Cloud Land, we've read all the cloud computing news from AWS, Azure and GCP again this week, so you don't have to.  

Lots of movement in the AWS camp this week, proving they do have a Seoul. The CDK for Terraform looks pretty cool, as do the increased storage and compute offerings from the big three cloud providers.

We've been busy enhancing our AWS VPC diagram generator amongst some other cloud visualization development.

Hope you find something of interest.

Fourth AZ for the AWS Seoul region now available: ap-northeast-2d

This week AWS added a fourth AZ to the Asia Pacific (Seoul) Region to support high demand from its growing Korean customer base. This fourth AZ gives customers additional flexibility to architect scalable, fault-tolerant, and highly available applications in the Asia Pacific (Seoul) Region.

Seoul now becomes the fourth Region with four or more AZs, following US East (N. Virginia), US West (Oregon), and Asia Pacific (Tokyo). AZs within AWS Regions consist of one or more discrete data centers, each with redundant power, networking, and connectivity, and each housed in separate facilities.

You can now select the fourth AZ in the Seoul Region via the AWS Management Console, the command-line interface (CLI), and the SDKs.
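
A quick way to confirm the new AZ is visible to your account is to list the Availability Zones in the region. As a rough sketch (not taken from the AWS announcement), here's what that looks like with the AWS SDK for JavaScript v3 in TypeScript, assuming credentials are already configured:

    // List the Availability Zones visible to your account in the Seoul region.
    // Illustrative sketch only; requires @aws-sdk/client-ec2 and configured AWS credentials.
    import { EC2Client, DescribeAvailabilityZonesCommand } from "@aws-sdk/client-ec2";

    async function listSeoulZones(): Promise<void> {
      const client = new EC2Client({ region: "ap-northeast-2" });
      const { AvailabilityZones } = await client.send(new DescribeAvailabilityZonesCommand({}));
      // Should now include ap-northeast-2d alongside the existing zones.
      AvailabilityZones?.forEach((az) => console.log(az.ZoneName, az.State));
    }

    listSeoulZones();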

https://aws.amazon.com/blogs/aws/now-open-fourth-availability-zone-in-the-aws-asia-pacific-seoul-region/

 

Amazon Interactive Video Service

The new Amazon Interactive Video Service (IVS) allows you to add live video directly to your own apps and websites, so you can now integrate interactive, low-latency live video into an application.

The service enables you to create a channel using either the IVS Console or the API. You can then use any standard streaming software to stream video to this channel, and the service does everything required to make the live video available to any viewer around the world. The service includes a player SDK that makes it straightforward to integrate the live video into your web, iOS, or Android project.
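
As a rough illustration of that flow (not from the AWS post), here's a minimal TypeScript sketch that creates a channel with the AWS SDK for JavaScript v3 and prints the details you'd hand to your streaming software and player:

    // Create an IVS channel and print its ingest endpoint, stream key and playback URL.
    // Illustrative sketch only; requires @aws-sdk/client-ivs and configured AWS credentials.
    import { IvsClient, CreateChannelCommand } from "@aws-sdk/client-ivs";

    async function createChannel(): Promise<void> {
      const ivs = new IvsClient({ region: "us-west-2" });
      const { channel, streamKey } = await ivs.send(
        new CreateChannelCommand({ name: "my-live-channel", latencyMode: "LOW", type: "STANDARD" })
      );
      // Point your streaming software at the ingest endpoint using the stream key,
      // then load channel.playbackUrl with the IVS player SDK in your web or mobile app.
      console.log(channel?.ingestEndpoint, streamKey?.value, channel?.playbackUrl);
    }

    createChannel();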

https://aws.amazon.com/blogs/aws/amazon-interactive-video-service-add-live-video-to-your-apps-and-websites/

AWS CDK for Terraform 

Infrastructure as Code (IaC) is a fundamental component of modern DevOps practices because it enables you to deploy any version of your application infrastructure at will, and facilitates the full lifecycle management of all the resources required to run and monitor your application. Organizations that have adopted DevOps practices often deploy hundreds or even thousands of changes to production a day, allowing them to deliver software faster, cheaper, and with lower risk.

When you explore the IaC options available today, you quickly discover customers have many choices, and two of the most popular for deploying infrastructure to AWS are CloudFormation, a service native to AWS, and Terraform, an open-source offering from HashiCorp. HashiCorp is an AWS Partner Network (APN) Advanced Technology Partner and member of the AWS DevOps Competency, and Terraform is a widely used tool that allows you to create, update, and version your infrastructure. According to GitHub Octoverse, HashiCorp Configuration Language (HCL) has been one of the fastest-growing languages over the past several years.

CloudFormation YAML and Terraform HCL are popular IaC languages and the right fit for many customers’ use cases; however, AWS often hear other customers say they want to define and provision infrastructure with the same familiar programming languages used to code their applications, rather than having to learn a new domain-specific language. The AWS Developer Tools team responded in 2019 with the AWS CDK for CloudFormation, and now AWS and HashiCorp are proud to announce that they are bringing the CDK to Terraform.
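
To give a feel for what that means in practice, here's a rough TypeScript sketch of a CDK for Terraform stack based on the preview (the APIs may change, and the generated ./.gen provider path and AMI ID below are placeholders):

    // Define an EC2 instance with Terraform, written in TypeScript instead of HCL.
    // Rough sketch of the cdktf preview; the provider bindings are generated by `cdktf get`.
    import { Construct } from "constructs";
    import { App, TerraformStack } from "cdktf";
    import { AwsProvider, Instance } from "./.gen/providers/aws"; // generated provider (placeholder path)

    class MyStack extends TerraformStack {
      constructor(scope: Construct, name: string) {
        super(scope, name);
        new AwsProvider(this, "aws", { region: "us-east-1" });
        new Instance(this, "hello", { ami: "ami-2757f631", instanceType: "t2.micro" }); // placeholder AMI
      }
    }

    const app = new App();
    new MyStack(app, "hello-terraform");
    app.synth();

Running `cdktf synth` turns this into Terraform configuration, which you then deploy with `cdktf deploy` or plain `terraform apply`.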

https://aws.amazon.com/blogs/developer/introducing-the-cloud-development-kit-for-terraform-preview/

 

CDK Pipelines Developer Preview Announced 

The AWS Cloud Development Kit (AWS CDK) is an open-source software development framework to define cloud infrastructure in familiar programming languages and provision it through AWS CloudFormation. The AWS CDK consists of three major components:

  • The core framework for modeling reusable infrastructure components
  • A CLI for deploying CDK applications
  • The AWS Construct Library, a set of high-level components that abstract cloud resources and encapsulate proven defaults

The CDK makes it easy to deploy an application to the AWS Cloud from your workstation by simply running cdk deploy. This is great when you’re doing initial development and testing, but you should deploy production workloads through more reliable, automated pipelines.

It has always been possible to configure your preferred CI/CD system to deploy CDK applications continuously, but customers have been asking AWS to make it even easier and more turnkey. This makes perfect sense: one of the core tenets of the CDK has always been to simplify cloud application development as much as possible, so you can focus on the parts that are relevant to you.

This week AWS were happy to announce the Developer Preview release of CDK Pipelines, a high-level construct library that makes it easy to set up a continuous deployment pipeline for your CDK applications, powered by AWS CodePipeline.
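
As a rough idea of what the construct looks like in TypeScript (this is the developer preview API, so exact names may change, and the GitHub owner, repo and secret below are placeholders):

    // Wire a CDK app up to a CodePipeline-backed deployment pipeline.
    // Rough sketch of the CDK Pipelines developer preview; repo and secret names are placeholders.
    import * as codepipeline from "@aws-cdk/aws-codepipeline";
    import * as actions from "@aws-cdk/aws-codepipeline-actions";
    import { CdkPipeline, SimpleSynthAction } from "@aws-cdk/pipelines";
    import { Construct, SecretValue, Stack, StackProps } from "@aws-cdk/core";

    class PipelineStack extends Stack {
      constructor(scope: Construct, id: string, props?: StackProps) {
        super(scope, id, props);

        const sourceArtifact = new codepipeline.Artifact();
        const cloudAssemblyArtifact = new codepipeline.Artifact();

        const pipeline = new CdkPipeline(this, "Pipeline", {
          cloudAssemblyArtifact,
          sourceAction: new actions.GitHubSourceAction({
            actionName: "GitHub",
            owner: "my-org",            // placeholder
            repo: "my-cdk-app",         // placeholder
            oauthToken: SecretValue.secretsManager("github-token"),
            output: sourceArtifact,
          }),
          synthAction: SimpleSynthAction.standardNpmSynth({ sourceArtifact, cloudAssemblyArtifact }),
        });

        // pipeline.addApplicationStage(...) then adds the environments your app deploys into.
      }
    }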

https://aws.amazon.com/blogs/developer/cdk-pipelines-continuous-delivery-for-aws-cdk-applications/

Azure Data Factory Managed Virtual Network

Azure Data Factory is a fully managed, easy-to-use, serverless data integration and transformation solution for ingesting and transforming all your data. Choose from over 90 connectors to ingest data and build code-free or code-centric ETL/ELT processes.

Security is a key tenet of Azure Data Factory. Customers want to protect their data sources and keep data transmission within a secure network environment as much as possible; any man-in-the-middle or traffic-spoofing attack on a public network could lead to data security problems and data exfiltration.

Microsoft are glad to announce the preview of Azure Data Factory Managed Virtual Network. This feature provides a more secure and manageable data integration solution. With it, you can provision the Azure Integration Runtime in a Managed Virtual Network and leverage Private Endpoints to securely connect to supported data stores. Data traffic between the Azure Data Factory Managed Virtual Network and your data stores goes through Azure Private Link, which provides secure connectivity and eliminates data exposure to the internet. With the Managed Virtual Network and Private Endpoints, you can also offload the burden of managing the virtual network to Azure Data Factory and protect against data exfiltration.

https://azure.microsoft.com/en-us/blog/azure-data-factory-managed-virtual-network/

Azure shared disks now GA, plus new Azure Disk Storage enhancements

Organizations are changing how they run their businesses, and many are looking to accelerate their move to the cloud to take advantage of the benefits it offers, including lower total cost of ownership (TCO) and improved flexibility and security, without sacrificing performance, application compatibility, or availability. Microsoft are committed to delivering new innovations to help their customers easily migrate business-critical applications to Azure.

This week, they announced the general availability of shared disks on Azure Disk Storage—enabling you to more easily migrate your existing on-premises Windows and Linux-based clustered environments to Azure. Microsoft also announced important new enhancements for Azure Disk Storage to provide you with more availability, security, and flexibility on Azure.

https://azure.microsoft.com/en-us/blog/announcing-the-general-availability-of-azure-shared-disks-and-new-azure-disk-storage-enhancements/

AMD-based memory-optimized Azure VMs now in more regions

Last year Microsoft announced the availability of the Azure Virtual Machine Da v4 and Das v4 series for general purpose Linux and Windows applications, and the Azure Virtual Machine Ea v4 and Eas v4 series for memory-intensive Linux and Windows workloads.

All four virtual machine series are based on the AMD EPYC™ 7452 processor.

This week Microsoft announced the expanded availability of these virtual machine (VM) sizes in new Azure regions and support for additional Availability Zones. This wider deployment offers your organization the ability to span and run your applications around the world.

https://azure.microsoft.com/en-us/blog/amd-based-memory-optimized-azure-virtual-machines-now-available-in-more-regions/

Azure Blob Storage maximum blob size increased to 200 TB

Azure Blob storage is a massively scalable object storage solution that serves from small amounts to hundreds of petabytes of data per customer across a diverse set of data types, including logging, documents, media, genomics, seismic processing, and more. Read the Introduction to Azure Blob storage to learn more about how it can be used in a wide variety of scenarios.

Azure customers with on-premises workloads today use files limited only by the local filesystem, which can allow file sizes up to exabytes. Most usage never approaches that limit, but specific workloads that rely on large files do scale into the tens of terabytes.

Microsoft recently announced the preview of a new maximum blob size of 200 TB (specifically 209.7 TB), up from the current limit of 5 TB, a 40x increase. The new limit of over 200 TB per object is much larger than other vendors' 5 TB maximum object size. This increase allows workloads that currently require multi-TB files to be moved to Azure without additional work to break up these large objects.

https://azure.microsoft.com/en-us/blog/run-high-scale-workloads-on-blob-storage-with-new-200-tb-object-sizes/

Google C2C 

This week Google launched C2C (Customer to Community), a new, independent community for GCP customers. C2C is a platform that will bring together IT executives, developers, and other cloud professionals from Google Cloud customers across the globe. By building a community where GCP customers can learn, connect, and share knowledge, Google can harness the community's collective power to create an even better cloud that addresses customer needs.

Customers who join C2C will receive exclusive networking opportunities, as well as visibility into the Google Cloud ecosystem, with benefits such as:

  • Opportunities to make connections and learn from other customers, including sharing knowledge and best practices through virtual and in-person events;

  • Expanded access to Google Cloud experts and content, such as knowledge forums, white papers, and methodologies;

  • Early and exclusive access to Google Cloud product roadmaps, with opportunities to provide feedback and act as customer-advisors.

Google are inviting all customers in North America and EMEA to join C2C, and look forward to expanding the community to more regions and more customers in the coming weeks and months. The link to join is below.

https://cloud.google.com/blog/topics/inside-google-cloud/announcing-c2c 

GCP Confidential Computing

Google believe the future of cloud computing will increasingly shift to private, encrypted services that give users confidence they are always in control of the confidentiality of their data.

Google Cloud encrypts data at rest and in transit, but customer data must be decrypted for processing. Confidential Computing is a breakthrough technology that encrypts data in use, while it is being processed. Confidential Computing environments keep data encrypted in memory and elsewhere outside the central processing unit (CPU).

Confidential VMs, now in beta, are the first product in Google Cloud's Confidential Computing portfolio. Google already employ a variety of isolation and sandboxing techniques as part of their cloud infrastructure to help make GCP's multi-tenant architecture secure. Confidential VMs take this to the next level by offering memory encryption so that you can further isolate your workloads in the cloud. Confidential VMs can help Google customers protect sensitive data, but Google think they will be especially interesting to those in regulated industries.

https://cloud.google.com/blog/products/identity-security/introducing-google-cloud-confidential-computing-with-confidential-vms

Upcoming Events:   

 

Alexa Live - July 22, 2020

Voice is becoming part of the tech landscape, as is natural language processing. In this virtual developer education event, AWS will cover the Alexa Skills Kit (ASK), the Alexa Voice Service, the Alexa Connect Kit, and the Smart Home Skill API.

The event is free.

https://aws.amazon.com/blogs/machine-learning/discover-the-latest-in-voice-technology-at-alexa-live-a-free-virtual-event-for-builders-and-business-leaders/

 

Google Cloud Next OnAir

Google's nine-week digital event kicked off on July 14th, with different topics covered each week:

  • Productivity & Collaboration - July 21st
  • Infrastructure - July 28th
  • Security - August 4th
  • Data Analytics - August 11th
  • Data Management and Databases - August 18th
  • Application Modernization - August 25th
  • Cloud AI - September 1st
  • Business Application Platform - September 8th

 

Full information and session times here: https://cloud.withgoogle.com/next/sf

Azure Virtual Events

Microsoft have a full schedule of virtual events.

A full list, including session times and details, is here: https://azure.microsoft.com/en-us/community/events/

AWS Events:

AWS events are pretty fluid at the moment, with most in-person events being cancelled or postponed. A number have been moved online, and full details can be found here: https://aws.amazon.com/events/

Thanks for reading; we hope you found something useful.

hava.io allows users to visualise their AWS, GCP and Azure cloud environments in interactive diagram form, including unique infrastructure, security and container views. hava.io continuously polls your cloud configuration and logs changes in a version history for later inspection, which helps with issue resolution and provides a history of all configs for audit and compliance purposes. This includes VPC infrastructure and AWS microservices architecture diagrams.

If you haven't taken a hava.io free trial to see what it can do for your workflow, security and compliance needs, please get in touch.

 

You can reach us on chat, email sales@hava.io or book a callback or demo below.