Posts Tagged 'Amazon Web Services'

The Cloud goes to Hollywood

Earlier this week I attended a one-day seminar presented by Amazon Web Services in Los Angeles entitled “Digital Media in the AWS Cloud”. Since I was recently involved in a media project, I wanted to see what services Amazon and some of their partners offer specifically for media workloads. Some of these services I had worked with before; others were new to me.

The five areas of consideration are:

  1. Ingest, Storage and Archiving
  2. Processing
  3. Security
  4. Delivery
  5. Automating workflows

Media workflows typically involve many huge files. To facilitate moving these assets into the cloud, Amazon offers a service called AWS Direct Connect. This service allows you to bypass the public Internet and create a dedicated network connection into AWS, supporting transfer speeds of up to 10 Gbps. A fast file transfer product from Aspera and an open source solution called Tsunami UDP were also showcased as ways to reduce upload time. Live data is typically uploaded to S3 and then archived in Glacier. It turns out the archiving can be accomplished automatically by simply setting a lifecycle rule for objects in a bucket that moves them to Glacier on a certain date or when the objects reach a specified age. Pretty cool. I had not tried that before but I certainly will now!
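For illustration, here is a minimal sketch of setting such a lifecycle rule with the AWS SDK for .NET. The bucket name, prefix and transition age are hypothetical, and the model property names have shifted between SDK versions (newer releases use a Transitions list and async calls):

    using System.Collections.Generic;
    using Amazon.S3;
    using Amazon.S3.Model;

    class LifecycleExample
    {
        static void Main()
        {
            // Credentials and region are taken from the SDK configuration.
            var s3 = new AmazonS3Client();

            // Move everything under media/raw/ to Glacier 30 days after creation.
            s3.PutLifecycleConfiguration(new PutLifecycleConfigurationRequest
            {
                BucketName = "my-media-bucket",   // hypothetical bucket
                Configuration = new LifecycleConfiguration
                {
                    Rules = new List<LifecycleRule>
                    {
                        new LifecycleRule
                        {
                            Id = "archive-raw-footage",
                            Prefix = "media/raw/",
                            Status = LifecycleRuleStatus.Enabled,
                            Transition = new LifecycleTransition
                            {
                                Days = 30,
                                StorageClass = S3StorageClass.Glacier
                            }
                        }
                    }
                }
            });
        }
    }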

For processing, Amazon has recently added a service called Elastic Transcoder. Although technically still in beta, this service looks extremely promising. It provides a cost-effective way to transcode video files in a highly scalable manner using the familiar cloud on-demand, self-service payment and provisioning model. This lowers the barriers to entry for smaller studios that may previously have been unable to afford the large capital investment required to acquire on-premises transcoding capabilities.
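As a rough sketch of how simple the programming model is, here is what submitting a transcoding job might look like with the .NET SDK. The pipeline id, preset id and object keys are hypothetical, and the exact response shape varies by SDK version:

    using Amazon.ElasticTranscoder;
    using Amazon.ElasticTranscoder.Model;

    class TranscodeExample
    {
        static void Main()
        {
            var ets = new AmazonElasticTranscoderClient();

            // The pipeline ties together input/output buckets and an IAM role;
            // the preset defines the output format (e.g. a 720p H.264 preset).
            var response = ets.CreateJob(new CreateJobRequest
            {
                PipelineId = "1111111111111-abcde1",            // hypothetical
                Input = new JobInput { Key = "raw/episode01.mov" },
                Output = new CreateJobOutput
                {
                    Key = "web/episode01.mp4",
                    PresetId = "1351620000001-000010"           // illustrative preset id
                }
            });

            System.Console.WriteLine("Job id: " + response.Job.Id);
        }
    }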

In terms of security, I was delighted to learn that AWS complies with the best practices established by the Motion Picture Association of America (MPAA) for storage, processing and privacy of media assets. This means that developers who create solutions on top of AWS are only responsible for ensuring compliance at the operating system and application layers. It seems that Hollywood, with its very legitimate security concerns, is beginning to trust Amazon’s shared responsibility model.

Delivery is accomplished using Amazon’s CloudFront service, which caches media files at globally distributed edge locations geographically close to users. CloudFront works very nicely in conjunction with S3 but can also be used to cache static content from any web server, whether it is running on EC2 or not.

Finally, the workflows can be automated using the Simple Workflow Service (SWF). This service provides a robust way to coordinate tasks and manage state asynchronously for use cases that involve multiple AWS services. In this way the entire pipeline from ingest through processing can be specified in a workflow, then scaled and repeated as required.
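A full SWF application needs decider and activity workers polling for tasks, which is beyond the scope of this post, but kicking off an execution is straightforward. A minimal sketch, with a hypothetical domain and workflow type that would have been registered beforehand:

    using Amazon.SimpleWorkflow;
    using Amazon.SimpleWorkflow.Model;

    class WorkflowExample
    {
        static void Main()
        {
            var swf = new AmazonSimpleWorkflowClient();

            // Separate decider and activity workers poll SWF and carry out
            // the actual ingest and transcoding steps.
            swf.StartWorkflowExecution(new StartWorkflowExecutionRequest
            {
                Domain = "media-pipeline",      // hypothetical domain
                WorkflowId = "ingest-episode01",
                WorkflowType = new WorkflowType { Name = "IngestAndTranscode", Version = "1.0" },
                Input = "{\"s3Key\": \"raw/episode01.mov\"}"
            });
        }
    }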

So, in summary, there is an AWS offering for many of the requirements involved in producing a small or feature-length film. The elastic scalability of the services allows both small and large players to compete by paying only for the resources they need and use. In addition, there are many specialized AMIs available in the AWS Marketplace which are built specifically for media processing. That, however, is a discussion for another time!

To learn more about how AWS can be leveraged to process your workload (media or otherwise) you might like to attend Learning Tree’s Hands-on Amazon Web Services course.

Kevin Kell

Ch … Ch … Changes in the Cloud!

Wow!

You take a few months' break from teaching cloud computing and when you come back to it things are not the same. Most notably, there are several significant new features in Amazon Web Services and Microsoft Azure.

Let’s first take a look at Azure.

The Management Portal has completely changed. It now has a clean new look that gives clear access to all of the Azure services. Notice all the additional services that are now available beyond the basic cloud services, database, storage and service bus.

Figure 1. Azure Management Portal

It is nice to see that Microsoft has addressed the needs of people who just want to host a simple web site on Azure. This was not very cost-effective using cloud services, but now up to 10 web sites per region can be deployed for free using the Azure Web Sites service. There is also a smaller-sized SQL database available for $5 per month, which may be adequate for such applications.

For me the other exciting development is Business Analytics as a service. A key component of this is SQL Reporting. The SQL Reporting service looks to be essentially SQL Server Reporting Services (SSRS) in the cloud. At $0.16 per hour per deployed report server instance it is still a little expensive (imho) but is significantly reduced from the astronomical $0.88 per hour it was previously. Business Analytics also includes an implementation of Hadoop which is going to be very accessible and familiar for .NET developers.

The Amazon Web Services Management Console has also had a facelift.

Figure 2. AWS Management Console

Several services have been added, including Redshift for data warehousing and OpsWorks for application management. Glacier, which is now several months old, offers a very cost-effective solution for archiving. Just last week Amazon announced the ability to copy Amazon Machine Images (AMIs) from one region to another. This is a great enhancement and something I have been wanting for a while. Certainly it opens up many options for high availability and disaster recovery solutions that span regions, not just availability zones.
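For the curious, here is roughly what the cross-region copy looks like from the .NET SDK. The regions and AMI id are hypothetical; note that the client is created in the destination region and the request names the source:

    using Amazon;
    using Amazon.EC2;
    using Amazon.EC2.Model;

    class CopyAmiExample
    {
        static void Main()
        {
            // Create the client in the *destination* region.
            var ec2 = new AmazonEC2Client(RegionEndpoint.USWest2);

            var response = ec2.CopyImage(new CopyImageRequest
            {
                SourceRegion = "us-east-1",
                SourceImageId = "ami-12345678",        // hypothetical AMI
                Name = "web-server-v7-uswest2"
            });

            System.Console.WriteLine("New AMI id: " + response.ImageId);
        }
    }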

Constant change is the nature of cloud computing. That definitely makes keeping up with the technologies a challenge. I look forward to getting back to teaching it now for a while. I am currently on the schedule to teach our Introduction to Cloud Computing course in April and June and our Amazon Web Services course in April and May. Why not consider joining us for one of those sessions?

Kevin Kell

Implementing a Private Cloud Solution

Last week I attended Learning Tree’s “Implementing a Private Cloud Solution” course at our Reston Education Center. It is a great course for anyone seeking in-depth technical details on how to build their own on-premises private cloud. The course also covers using a hosted private cloud solution and building secure connections to your own data center.

This course is not for the faint of heart! It is also not for the technically challenged! When you show up Tuesday morning you need to be prepared to work very hard for the next four days. The course author, Boleslav Sykora, has put together a fast-paced session that gives you as much technical detail as you would ever want on the subject. It is the type of course where you will want to come early and stay late each and every day so you can work through all the extensive bonus exercises that are offered. I loved it and I think you will too!

We feature building two private clouds, one using Eucalyptus and another using Microsoft System Center, completely from scratch. There is a lot of Linux command line stuff and quite a bit of detailed networking configuration. This is exactly the reality of what is involved if you want to build your own private cloud. Over the four days you come to understand that private cloud computing is not some mystical, magical hype but is an evolution of solid fundamental concepts that have been around for some time. This course will appeal to technical professionals who want to gain real experience implementing solutions that will define the future of the on-premises data center.

For those who would prefer not to bother with the complexity of an internal private cloud implementation there are many hosted solutions to choose from. Probably the best known is Amazon’s Virtual Private Cloud (VPC). Once you use VPC on Amazon you will likely never go back to using EC2 without it.

In fact as I write this blog I am on a train heading to New York. There I will teach Learning Tree’s “Cloud Computing with Amazon Web Services” course. That, also, is a great course!

Because there are many private cloud implementations based on the Amazon EC2 model and API (particularly Eucalyptus), Amazon has become something of a de facto standard for how Infrastructure as a Service (IaaS) is done. Even if you believe you would never use a public cloud for a production system, there is much to be learned about cloud computing from Amazon. Beyond that, the public cloud is a great place to do testing, development and proof-of-concept work before investing the time and capital required to build your own private cloud. Public clouds such as Amazon can also become part of a hybrid solution that features the best of what private and public clouds have to offer. Learning Tree’s Amazon Web Services course gives you hands-on experience with many aspects of Amazon’s cloud and shows you how to build solutions using the various services offered there.

So if you are a hardcore techie who wants to have end-to-end control over all aspects of a cloud solution come to Learning Tree’s private cloud course. If you would like to understand how to leverage the Amazon public cloud or to understand the service models of arguably the most dominant cloud provider in the world then come to Learning Tree’s Amazon Web Services course. Either way I hope to see you soon!

Kevin Kell

Creating Items and Exploring Tables in Amazon DynamoDB

Having earlier created some tables for our application, we will now quickly see how to store some items.

Using the AWS SDK, this can be done programmatically very simply, as follows:

Figure 1. Code to populate the Players Table with sample Items.
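Since the figure does not reproduce here, a sketch of what such code might look like follows. The attribute names are assumed from context, and today's SDK uses the DynamoDBv2 namespace, which postdates this post:

    using System.Collections.Generic;
    using Amazon.DynamoDBv2;
    using Amazon.DynamoDBv2.Model;

    class PlayerLoader
    {
        static void Main()
        {
            var db = new AmazonDynamoDBClient();

            // The first item has a multi-valued "Titles" attribute (a string set)...
            db.PutItem(new PutItemRequest
            {
                TableName = "Players",
                Item = new Dictionary<string, AttributeValue>
                {
                    { "PlayerId", new AttributeValue { N = "1" } },
                    { "Name",     new AttributeValue { S = "Alice" } },
                    { "Titles",   new AttributeValue { SS = new List<string> { "Rookie", "Champion" } } }
                }
            });

            // ...while the second item omits that attribute entirely.
            db.PutItem(new PutItemRequest
            {
                TableName = "Players",
                Item = new Dictionary<string, AttributeValue>
                {
                    { "PlayerId", new AttributeValue { N = "2" } },
                    { "Name",     new AttributeValue { S = "Bob" } }
                }
            });
        }
    }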

A couple of points to reinforce:

  • Not all items need to have the same set of attributes
  • Attributes can be single or multi-valued

Tables can be explored interactively using the AWS Explorer from within Visual Studio.

Figure 2. Explore the Players Table using AWS Explorer from within Visual Studio

Note that item order is not preserved, nor is the order of entries in multi-valued attributes.

Alternatively, as of May 22, 2012, you can also use the AWS Management Console to explore, monitor and configure DynamoDB tables.

Okay, cool. So now what? Well, we could (and we will) consider ways to query the data. We could also talk about the usual CRUD stuff, but we are not going to do that right now. Instead, our next game will be to wrap up an interface to our storage that can be implemented as a Web Service. This will allow us to further abstract from DynamoDB and define our interface in terms of objects in our problem domain. Under the covers we will be using DynamoDB, but we will have isolated the specific code so that if, in the future, we wanted to use something else for storage (SimpleDB, Azure Tables or even a relational database) it would be relatively straightforward to make the necessary changes.
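As a preview, the storage-agnostic interface might look something like this. All of the names here are hypothetical and just illustrate the idea:

    using System.Collections.Generic;

    // A domain object, free of any DynamoDB-specific types.
    public class Player
    {
        public int PlayerId { get; set; }
        public string Name { get; set; }
        public List<string> Titles { get; set; }
    }

    // A DynamoDB-backed class would implement this interface and could later
    // be swapped for SimpleDB, Azure Tables or a relational store.
    public interface IPlayerStore
    {
        Player GetPlayer(int playerId);
        void SavePlayer(Player player);
        void DeletePlayer(int playerId);
    }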

That will be the subject of my next post. In the meantime you might want to check out some of the supplementary course materials available for Learning Tree’s Amazon Web Services course. While some of the programming references there are for Java rather than C#, you will find that the concepts are equally relevant.

Kevin Kell

Committing to the Cloud

When I teach Learning Tree’s Introduction to Cloud Computing course I often get questions about how components from different public cloud vendors can be used together in an integrated application. The answer, of course, is: “it all depends on the application.” Here I would like to give a more comprehensive response than that.

This will be the first of a series of blog posts that will explore that question in some detail. I will go through the process of building a real-world application using services from Microsoft, Amazon and Google. Further, I will do all development and testing in the cloud. Dev/test is often a good use case for organizations wishing to get started in the cloud. I will use a very minimal local machine, which will allow me to connect to cloud resources using only RDP, SSH and HTTP.

My application will be designed using an n-tier architecture. There will be, at a minimum, tiers for storage, business logic and presentation. Since I am attempting to illustrate interoperability between cloud vendors, it makes some sense to host the components for each architectural tier with a different provider. So, somewhat arbitrarily, I will choose to host the storage on Amazon Web Services (exact service to be defined later), the business logic on Azure (so I can program in C#) and the presentation on Google App Engine (since it is very cost-effective).

Follow along over the next few weeks. We will go from square zero to a fully functional, interoperable web application that uses services from the “Big 3” public cloud providers. We will provision our development, testing and deployment environments from a lightweight client (tablet PC). All the while we will track cloud costs and development time.

This should be fun!

Kevin Kell

Using the AWS SDK for .NET is Fun, Easy and Productive!

As a programmer, one of the things I really like about Amazon Web Services is that there is SDK support for a variety of languages. That makes it easy to get started automating AWS solutions using tools you are already familiar with. My recent programming experience has been primarily with C#. I chose the AWS SDK for .NET for my latest project since it was somewhat time critical (when are they not!?) and I had to go with a language I already knew pretty well.

The SDK download from Amazon includes a library for .NET, code samples and a Toolkit for Visual Studio. Once installed, the toolkit provides a New Project template in Visual Studio that gives you a good place to start. You also get the AWS Explorer, which makes it very easy to manage your Amazon resources right from within Visual Studio.

Figure 1. Visual Studio with the AWS Toolkit installed

The library provides an intuitive object wrapper over the Amazon APIs. If you have used the Amazon command line tools or the management console you should feel pretty comfortable with the .NET implementation. For example, to use EC2 from within a C# application you create an instance of an EC2 client using the AWSClientFactory. You can then call methods on the AmazonEC2 object you create. These methods correspond to the command line commands and API calls you have already been using. The wizard even creates some sample code to get you going.

A simple method to launch an EC2 instance might look like this:

Figure 2. Simple Method to Launch an EC2 Instance
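Since the figure does not reproduce here, a sketch of what such a method might look like with that era's SDK follows. The AMI id is hypothetical, and later SDK versions replaced AWSClientFactory and flattened the response object:

    using Amazon;
    using Amazon.EC2;
    using Amazon.EC2.Model;

    class Launcher
    {
        public static string LaunchInstance(string amiId)
        {
            // The factory reads credentials from the application configuration.
            var ec2 = AWSClientFactory.CreateAmazonEC2Client();

            var response = ec2.RunInstances(new RunInstancesRequest
            {
                ImageId = amiId,          // e.g. "ami-12345678" (hypothetical)
                MinCount = 1,
                MaxCount = 1,
                InstanceType = "t1.micro"
            });

            return response.RunInstancesResult.Reservation.RunningInstance[0].InstanceId;
        }
    }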

By providing support for multiple languages Amazon opens up AWS to developers from many backgrounds. Whether you program in Java, Ruby, PHP, Python or C# you will find an SDK that will get you started building solutions that leverage the many services offered by Amazon in the Cloud.

Kevin Kell

Cloud Computing with Amazon Web Services

Wow, time flies!

As we reach the end of September we look forward to the launch of the newest addition to Learning Tree’s cloud computing curriculum. Course 1205, Cloud Computing with Amazon Web Services™: Hands-On, is slated for its first public run in December and we are on target! This course picks up where our introductory Course 1200 left off.

We assume a basic knowledge of cloud computing. This allows us to spend four full days exploring the various components of Amazon Web Services (AWS) in depth. We begin with an overview of the AWS architecture and then call out the main services to be discussed in detail. The services we cover include EC2, EBS, S3, SimpleDB, RDS, CloudWatch, CloudFront, SQS, VPC, Elastic Beanstalk, CloudFormation and SNS. In addition to the technical aspects, we also consider business and financial implications.

As this is a hands-on course we have exercises which guide the student through configuring and using AWS services. In the exercises we use the management console, the command line tools and the API (although this is not a programming course). It should be stressed, though, that in order to truly exploit the full power of AWS you need to move beyond the management console. Attendees should be familiar with using a command line interface.

We start by provisioning various compute and storage resources. The course also includes the creation of custom images as a way to adapt to business needs. Monitoring and scaling are employed as a means to track and respond to dynamic load requirements. Finally, as might be expected in a cloud course, we discuss AWS features that allow us to work securely in the cloud.

We’re pretty excited about the release of this course! We hope you are too. Check it out on our website and come join us!

Kevin Kell

