
Google Compute Engine Revisited

It has been a while since I have written anything about Google cloud computing. I took a first look at Google Compute Engine over a year ago but was stopped short because it was in limited preview and I could not get access. GCE has been generally available since May, so I thought I'd check back to see what has happened.

To use GCE you sign in to Google's Cloud Console using your Google account. From the Cloud Console you can also access the other Google cloud services: App Engine, Cloud Storage, Cloud SQL and BigQuery. There you can create a Cloud Project that ties the various services together.

Figure 1. Google Cloud Console

Unlike App Engine, which lets you create projects for free, GCE requires billing to be enabled up front. This, of course, will require you to create a billing profile and provide a credit card number. After that is done you can walk through a series of steps to launch a virtual machine instance. This is pretty standard stuff for anyone who has used other IaaS offerings.

Figure 2. Creating a new GCE instance

The choice of machine images is certainly much more limited than with other IaaS vendors I have used. At this time there seem to be only four available, and they are all Linux based. Presumably Google and/or the user community will add more as time passes. It is nice to see per-minute charge granularity, which in practice means a minimum charge of 10 minutes and then 1-minute increments beyond that. The smallest instance type I saw, though, was priced at $0.115 per hour, which makes GCE considerably more expensive than EC2, Azure and Rackspace. When you click the Create button it takes only a couple of minutes for your instance to become available.
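The billing model is easy to sanity-check with a little arithmetic. Here is a minimal sketch; the $0.115/hour rate is just what I observed for the smallest instance, so treat it as an assumption rather than a published price list:

```python
import math

HOURLY_RATE = 0.115  # USD/hour for the smallest instance I saw (assumed)

def gce_charge(runtime_minutes):
    """Estimate the charge for a single instance run under GCE's
    10-minute-minimum, then per-minute, billing model."""
    billed_minutes = max(10, math.ceil(runtime_minutes))
    return billed_minutes * HOURLY_RATE / 60

# A 3-minute run is billed as 10 minutes; a 61.5-minute run as 62 minutes.
```

So even a very short experiment costs you 10 minutes' worth, but after that you pay only for the minutes you actually use.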

Connecting to the instance seemed to me a little more complicated than with other providers. I am used to using PuTTY as my ssh client since I work primarily on a Windows machine. I had expected to be able to create a key pair when I launched the instance, but I was not given that option. To access the newly created instance with PuTTY you have to create a key pair using a third-party tool (such as PuTTYgen) and then upload the public key to GCE. You can do this through the Cloud Console by creating an entry in the instance Metadata with a key of sshKeys and a value in the format <username>:<public_key>, where <username> is the username you want to create and <public_key> is the actual text of the public key (not the filename). This can be copied from the PuTTYgen dialog. A bit of extra work, but arguably a better practice from a security perspective anyway.
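To make the expected metadata format concrete, here is a small sketch that builds the sshKeys value from a username and the public-key text copied out of PuTTYgen. The key string in the example is a shortened placeholder, not a real key:

```python
def gce_sshkeys_value(username, public_key_text):
    """Build the value for the GCE 'sshKeys' metadata entry.

    The format is '<username>:<public_key>', where the key is the
    actual OpenSSH-format key text, not a filename."""
    key = " ".join(public_key_text.split())  # collapse stray line breaks
    return "{0}:{1}".format(username, key)

# e.g. gce_sshkeys_value("kevin", "ssh-rsa AAAAB3... kevin@example")
#      -> "kevin:ssh-rsa AAAAB3... kevin@example"
```

Collapsing whitespace matters because a key pasted from a dialog sometimes picks up line breaks, and the metadata value must be a single line.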

Figure 3. Creating Metadata for the public key

After that is done it is straightforward to connect to the instance using PuTTY.

Figure 4. Connected to GCE instance via PuTTY

At this point I do not believe that Google Compute Engine is a competitive threat to established IaaS providers such as Amazon EC2, Microsoft Azure or Rackspace. To me the most compelling reason to prefer GCE over other options would be the easy integration with other Google cloud services. No doubt GCE will continue to evolve. I will check back on it again soon.

Kevin Kell

Check Out Federal News Radio’s “Ask the CIO” Segment – Sponsored by Learning Tree!

Learning Tree is proud to announce its sponsorship of Federal News Radio's "Ask the CIO" segment! Every Thursday morning at 10:30 a.m., host Jason Miller interviews federal agency CIOs about the latest directives, IT challenges and successes.

Catch this week’s segment with guest Sanjay Sardar, CIO of the Federal Energy Regulatory Commission, discussing FERC’s move to put mobility at the center of IT upgrades.

IaaS on Azure

Okay, this post may be a bit of a rant. Things should not be unnecessarily complicated … but they are.

I just wanted to launch a VM instance on Windows Azure Virtual Machines to take a look at the Visual Studio 2013 preview. Seems simple enough, right? Just log in to the Azure portal and spin up a new VM from the gallery, using an image preconfigured with VS 2013. Then step through the wizard and connect with RDP.

Figure 1. Select VS 2013 Image

Give the machine a name, select an instance size (Small, in my case), specify a username and password, and we should be good to go.

Well, yes, and no.

Okay, the instance launches and appears to be running. As far as I know I am being charged for this resource now.

Figure 2. Instance is running

Cool. My instance seems to have a public IP address and I should be able to connect to it via RDP, login using the username and password I specified and party on. Just like Amazon EC2 and Rackspace.

But no! There is a problem.

Try and connect using RDP to actually use the instance and there is an error.

Figure 3. Denied!

So, what is going on here?

Right now I haven’t the slightest idea. Maybe it is something simple or obvious. Maybe they didn’t enable RDP on the image. Maybe I am doing something stupid.  Maybe I need to open a port or set a security group or something.  But … at this point I don’t care.  I don’t have the time or patience to troubleshoot it.

Abort!

I’m going to spin up a Windows Server 2012 instance on Amazon EC2, install the VS 2013 preview myself and take it from there. At least I know I will have no problem connecting to the instance. Amazon IaaS is, imho, much more straightforward to use.

Kevin Kell

Get 50% OFF Select Learning Tree Courses by Joining the New My Learning Tree Community!

We have some more exciting news to share!

Instantly receive 50% OFF select Learning Tree courses when you join the new My Learning Tree Community.

Free membership includes: weekly updates of our 50% OFF course events, access to our instructor blogs, podcasts, white papers, course materials and other valuable resources.

Feel free to drop us a comment below if you have any questions — or, follow us:

Facebook
Twitter
LinkedIn
Google+

Join today by clicking here.

Backup to the Cloud

Certainly there are a number of online backup solutions that pre-date the cloud hype of recent years. In particular Carbonite, one of the most popular, has been around since 2005. One of the attractions of cloud computing, however, is the ability to just provision the resources you need and only pay for what you use. As more or less capacity is required resources can be scaled out or in. Further, cloud computing resources can be acquired using self-service tools and do not require commitment for fixed periods of time. Also, to me anyway, there is appeal to having backup data available in a standard format and accessible via a number of tools and known APIs.

CloudBerry Lab provides a variety of useful tools for working with cloud storage. Among the most interesting is CloudBerry Drive, which allows an Amazon S3 bucket, Azure Blob storage, Google Cloud Storage, Rackspace, OpenStack and other storage providers to be mounted as a mapped network drive. This is very cool.

Figure 1. CloudBerry Drive

CloudBerry Lab also makes an application called CloudBerry Backup which can be used to automate backup to Amazon S3 or Glacier, Azure Blob storage, HP Cloud or Rackspace. There is a specific CloudBerry Backup version for Desktop, Server, MSSQL, MS Exchange and Enterprise.

I purchased the Desktop version of both CloudBerry Drive and CloudBerry Backup and I am extremely happy with both products. The price was reasonable, the installation painless and both are working like a charm. I now routinely work with drives mapped to both Amazon S3 and Azure Blob storage and I have no problem storing critical files and projects on those drives. In fact they are probably safer there than on my own device since both S3 and Azure Blobs are at least an order of magnitude more reliable than my own hard disk.

I am also now doing automated backups of my files (local and cloud) to Amazon Glacier. To me there is no more cost effective solution out there right now. $0.01 per gigabyte per month? That is insane (in a good way)! CloudBerry Backup makes it very easy and now I don’t even have to think about it.
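At that rate the arithmetic is easy to sanity-check. A quick sketch, using the advertised $0.01 per gigabyte per month storage price (retrieval and request fees are ignored here):

```python
GLACIER_RATE = 0.01  # USD per GB-month for storage (advertised price;
                     # retrieval and request fees not included)

def glacier_storage_cost(gigabytes, months=1):
    """Storage-only cost estimate for keeping archives in Glacier."""
    return gigabytes * months * GLACIER_RATE

# 500 GB of backups stored for a full year: 500 * 12 * 0.01 = 60.0 dollars
```

Sixty dollars a year to keep half a terabyte of backups safely offsite is hard to argue with.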

Figure 2. CloudBerry Backup

So, regardless of what else you may or may not use the cloud for, as a backup solution it makes a lot of sense.

To learn more about ways to exploit the benefits of cloud computing Learning Tree offers an introductory course in cloud computing technologies and a hands-on course in Amazon Web Services.

Kevin Kell

Learning Tree Opens 34 NEW AnyWare Learning Centers across North America

We have some exciting news to share! Learning Tree has officially opened 34 New AnyWare Learning Centers across North America. You can eliminate travel costs and commuting time and take our IT and management courses locally at a designated center near you via AnyWare, our web-based attendance platform that allows you to experience the same hands-on, instructor-led classroom training live, online.

To view a complete list of Cloud Computing courses, click here. If you have any questions, feel free to drop us a note in the comments or follow us on any of our social media outlets:

Facebook
Twitter
LinkedIn
Google+

Happy Training!

Windows Azure Marketplace DataMarket

As I prepare to teach Learning Tree’s Power Excel course in Rockville next week I have been taking a closer look at PowerPivot. Since the course now uses Excel 2013 we have expanded the coverage of PowerPivot, which is now included with Excel; in Excel 2010 it was a separate add-in.

So what, you may ask, does that have to do with cloud computing? Well, as it turns out, PowerPivot is really well suited to consuming data that has been made available in the Windows Azure DataMarket.

The DataMarket is perhaps one of the less well known services offered as part of Windows Azure. In my opinion it has some growing to do before it reaches a critical mass. It has, however, made some impressive advancements since its inception in 2010. The DataMarket contains both free and paid-for data subscriptions that can be accessed using a variety of tools. Here I give a brief example of consuming a free subscription using PowerPivot.

The DataMarket does not appear anywhere on the Azure portal. To access it you need to create a separate account at https://datamarket.azure.com/ . Once you have established an account you can subscribe to any of the various datasets that have been published. You can browse and use the data from your browser, but I found it very easy and intuitive to subscribe to the data right from within PowerPivot.
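Under the hood a DataMarket subscription is an OData feed, which is why tools beyond PowerPivot can consume it. Here is a minimal sketch of composing a query URL against such a feed; the service root in the comment is a hypothetical example, and in practice the real feed URL and your account key (used for HTTP Basic authentication) come from your DataMarket account page:

```python
from urllib.parse import urlencode

def datamarket_query_url(service_root, top=None, filter_expr=None):
    """Compose an OData query URL for a DataMarket feed.

    Only the standard OData system options $top and $filter are used;
    authentication is handled separately by whatever client fetches
    the URL."""
    params = {}
    if top is not None:
        params["$top"] = str(top)
    if filter_expr is not None:
        params["$filter"] = filter_expr
    query = urlencode(params)  # note: '$' is percent-encoded as '%24'
    return service_root.rstrip("/") + ("?" + query if query else "")

# Hypothetical feed root:
# datamarket_query_url("https://api.datamarket.azure.com/Vendor/Crashes/v1/Crashes", top=10)
```

PowerPivot builds exactly this kind of request for you behind the wizard, which is what makes the integration feel so seamless.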

Figure 1. Consuming Azure DataMarket data using PowerPivot

I then chose to limit my selection to just the free subscriptions. In an actual application, of course, I would be able to search for data that was relevant to the analysis I was doing. For fun I decided to look at the USA 2011 Car crash data published by BigML. When I finish clicking through the wizard the data is imported into my PowerPivot Data Model and is available for my use. Here I can correlate it with other data I have to build up my analysis dataset.

Once the data is in PowerPivot I can quickly do analyses using familiar Excel tools. I can also use the reporting capabilities of Power View in Excel 2013 to create compelling presentations of the data.

Figure 2. Analysis of Car Crash data in Excel Power View

The easy integration between PowerPivot and the Azure DataMarket gives Excel users a powerful tool to augment their data analysis. In future posts I will explore some of the other services that Microsoft is offering through Azure to further enhance and simplify analysis of very large datasets.

Kevin Kell
