Using Cloud 9 To Explore Web Development

I’ve started going through the Getting Started guide for Google’s Material Design Components for Web (MDC Web). This is a prerequisite for the Material tutorials, and I’ve already run into one problem traced to an out-of-date installation of Node.js. Given that I’m learning a new piece of software, I’m sure I’ll run into more problems that require modifying my computer. Since mistakes are likely while I’m still learning, I’m wary of messing up my main Ubuntu environment.

What I need right now is an enclosed sandbox for experimentation, and I’ve already set up virtual machines I could use for this purpose. But since this is a web-based project, I decided to go all-in on the web world and try doing it online with a web-based development environment: Cloud 9.

I’ve played with Cloud 9 before, back when it was a startup with big dreams. It has since been acquired by Amazon and folded into the wide portfolio of Amazon Web Services (AWS). As far as I know, Cloud 9 has always run on AWS behind the scenes, but now a user is exposed to the underlying mechanisms. It means we now have better control over the virtual machines we’re running, which is mostly good. It also means users have to pay for those virtual machine resources, which is fairly inexpensive (I expect to pay less than $1 for this experiment) but isn’t as good as the “Free” it used to be. The saddest part is that it’s no longer “point-click-go” simple to get started. Getting Cloud 9 properly set up means climbing the learning curve for managing AWS security and permissions, which can be substantial.

Access Permissions

AWS has an entire document focused on authorization and access control for Cloud 9. For someone like me, who just wants to play with Cloud 9 but also wants to safely partition it off from the rest of my AWS account, the easiest thing to do is to create a new user account within my AWS dashboard. This account can be assigned a predefined access policy called AWSCloud9User, and that’ll be enough to get started. When logged in to this dedicated account, I can be confident mistakes won’t accidentally damage anything else I have in AWS.
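For reference, here’s a rough sketch of that same setup scripted with Python and boto3 instead of the console. The user name and password below are placeholders, and the policy ARN points at the AWSCloud9User managed policy mentioned above.

```python
import boto3

iam = boto3.client("iam")

# Placeholder name for the dedicated Cloud 9 sandbox user.
user_name = "cloud9-sandbox"

# Create the user and attach the predefined AWSCloud9User policy.
iam.create_user(UserName=user_name)
iam.attach_user_policy(
    UserName=user_name,
    PolicyArn="arn:aws:iam::aws:policy/AWSCloud9User",
)

# Give the user a console password so it can log in to the AWS dashboard.
iam.create_login_profile(
    UserName=user_name,
    Password="replace-with-a-strong-password",
    PasswordResetRequired=True,
)
```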

Network Permissions

With the power of fine-grained virtual machine control comes the responsibility of configuring it to act the way we want. When I last used Cloud 9, launching a piece of web hosting software on my VM meant it was immediately accessible from the internet. That meant I could bring it up in another browser window at my desk to see how it looked. However, this is no longer the default behavior.

Now that my VM runs as an Amazon EC2 instance, it has its own network firewall settings (a security group) to deal with. Not only that, the VM sits inside its own private network (an Amazon VPC), which has its own network firewall settings. Both of these firewalls must be configured to allow external access if I’m to host web content (as when exploring MDC Web) and view it on my own machine.
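To make that concrete, here’s roughly what opening one port on the EC2 instance’s security group looks like with boto3. The security group ID, region, and port are placeholders; Cloud 9 application previews typically use ports 8080 through 8082, so adjust to whatever your test server listens on. (The VPC’s network ACL may need a matching rule as well.)

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-west-2")  # assumed region

# Placeholder ID of the security group attached to the Cloud 9 EC2 instance.
security_group_id = "sg-0123456789abcdef0"

# Allow inbound TCP traffic to the port the test web server listens on.
ec2.authorize_security_group_ingress(
    GroupId=security_group_id,
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 8080,
        "ToPort": 8080,
        "IpRanges": [{
            "CidrIp": "0.0.0.0/0",
            "Description": "Temporary public access for testing",
        }],
    }],
)
```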

There’s a lot of documentation online for using Cloud 9. The specific configuration settings that need to be changed are found under “Previewing Running Applications,” in the section “Share a Running Application over the Internet.”

Amazon Machine Learning School Now Open

When I was learning about artificial intelligence in school, knowledge was found in the form of advanced textbooks and academic journals. It took a certain amount of dedication to understand and digest that information. Now, a few decades later, leading-edge research can be found on the internet. But even better: many educational resources for people outside those academic fields of study are also available online.

These resources have just grown again, as Amazon opened up their AWS Machine Learning course to the world. Formerly a part of internal employee training, it now helps people get up to speed outside the company as well. (And naturally helps convert them into paying AWS customers.)

There are different courses for different audiences, from developers to business professionals. And they cover different parts of AWS services. When I get around to taking these classes (the to-do list never seems to grow shorter…) I’ll likely dive into the developer track for visual recognition algorithms. If I feel good about my grasp of the code, I’ll see if I can integrate an Amazon DeepLens unit into my robotics projects.

This set of AWS courses is the latest addition to a long list of web-accessible resources for learning the latest AI tools. It’s great that I don’t need to enroll in school to see how the field has evolved since my time at UCLA.

My First Cloud Storage Failure

I count myself as a skeptic of the new cloud-based world. When I first learned of Dropbox I wasn’t willing to trust it with my data. When I read about the frustration of people whose data are still trapped on MegaUpload, I felt vindicated.

But the tide of progress moved forward, and now we have many cloud-based storage providers with enough of a track record for me to dip my toes in the water. In January 2016 I started using cloud-based storage for my personal needs, spreading my files across Microsoft OneDrive, Google Drive, and Amazon Drive.

That’s not to say I trust cloud-based storage yet. I still maintain my regimen of home offline backups on removable hard drives. And for the files that I feel are important, I duplicate storage across at least two of the cloud storage providers.

Almost a year and a half in, I admit I see a lot of benefit. I’ve become a lot less dependent on a specific computer, since most of my files are accessible from any machine with an internet connection. I can pick up work where I left off on another computer. I can refresh and reformat a computer with far less worry that I’ll destroy any irreplaceable data.

And then there are the wacky outliers. When I wanted to obtain a library card, one acceptable proof of local residency was a utility bill. Thanks to cloud storage, I was able to pull out my phone and retrieve a recent electric bill on the spot.

But just as with physical spinning hard drives, failure is only a matter of time. And tonight, after almost 18 months of flawless cloud storage performance, we have our first winner. Or more accurately, our first loser. Tonight I was not able to access my files on Amazon Drive. All I got was a “Loading” spinner that never went away.

The underlying Amazon storage infrastructure seems ok. (AWS S3 status is green across the board.) This must be a failure in their consumer-level storage offering, which will probably get fixed soon.

In the meantime, they have one annoyed customer.

Static Web Site Hosting with Amazon S3 and Route 53

Web application frameworks have the current spotlight, which is why I started learning Ruby on Rails to get an idea of what the fuss was about. But a big framework isn’t always the right tool for the job. Sometimes it’s just a set of static files to be served upon request. No server-side smarts necessary.

This was where I found myself when I wanted to put up a little web site to document my #rxbb8 project. I just wanted to document the design & build process, and I already had registered the domain rxbb8.com. The HTML content was simple enough to create directly in a text editor and styled with CSS from the Materialize library.

After I got a basic 1.0 version of my hand-crafted site, I uploaded the HTML (and associated images) to an Amazon S3 bucket. It only takes a few clicks to make files in an S3 bucket web-accessible via a long, cumbersome URL on an Amazon AWS domain: http://rxbb8.com.s3-website-us-west-2.amazonaws.com. Since I wanted this content to be accessible via the rxbb8.com domain I had already registered, I started reading up on the AWS service named in geek-humor style as Route 53.

Route 53 is designed to handle the challenges of huge web properties, distributing workload across many computers in many regions. (No single computer could handle all global traffic for, say, netflix.com.) The challenge for a novice like myself is to figure out how to pull out just the one tool I need from this huge, complex Swiss army knife.

Fortunately this usage case is popular enough for Amazon to have written a dedicated developer guide for it. Unfortunately, that page doesn’t have all the details. The writer helpfully points the reader to other reference articles, but those pages revert to talking about complex deployments, and again it takes effort to distill the simple basics out of the big feature list.

If you get distracted or lost, stay focused on this CliffsNotes version (a scripted sketch follows the list):

  1. Go into Route 53 dashboard, create a Hosted Zone for the domain.
  2. In that Hosted Zone, AWS has created two record sets by default. One of them is the NS type; write down the name servers it lists.
  3. Go to your domain registrar and tell them to point name servers for the domain to the AWS name servers listed in step 2.
  4. Create S3 storage bucket for the site, enable static website hosting.
  5. Create a new Record Set in the Route 53 Hosted Zone. Set “Alias” to “Yes” and point the alias target to the S3 storage bucket from step 4.

Repeat #4 and #5 for each sub-domain that needs to be hosted. (The AWS documentation created example.com and repeated 4-5 for www.example.com.)
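If you’d rather script it than click through the console, here’s a rough boto3 sketch of steps 1, 2, 4, and 5 (step 3 still happens at your registrar). The bucket name and region mirror my rxbb8.com setup, and the S3 website hosted zone ID for us-west-2 is my best recollection; verify it against the AWS endpoints documentation before relying on it.

```python
import boto3

s3 = boto3.client("s3", region_name="us-west-2")
route53 = boto3.client("route53")

# Step 4: create the bucket (its name must match the domain) and turn on
# static website hosting. The bucket contents also need public-read access.
s3.create_bucket(
    Bucket="rxbb8.com",
    CreateBucketConfiguration={"LocationConstraint": "us-west-2"},
)
s3.put_bucket_website(
    Bucket="rxbb8.com",
    WebsiteConfiguration={"IndexDocument": {"Suffix": "index.html"}},
)

# Steps 1 and 2: create the Hosted Zone and note its name servers,
# which go to the domain registrar in step 3.
zone = route53.create_hosted_zone(Name="rxbb8.com", CallerReference="rxbb8-setup-1")
print(zone["DelegationSet"]["NameServers"])

# Step 5: create an alias A record pointing the domain at the S3 website
# endpoint. Z3BJ6K6RIION7M should be the S3 website hosted zone ID for
# us-west-2 -- double-check against AWS documentation.
route53.change_resource_record_sets(
    HostedZoneId=zone["HostedZone"]["Id"],
    ChangeBatch={"Changes": [{
        "Action": "CREATE",
        "ResourceRecordSet": {
            "Name": "rxbb8.com",
            "Type": "A",
            "AliasTarget": {
                "HostedZoneId": "Z3BJ6K6RIION7M",
                "DNSName": "s3-website-us-west-2.amazonaws.com",
                "EvaluateTargetHealth": False,
            },
        },
    }]},
)
```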

And then… wait.

The update in step 3 needs time to propagate to name servers across the internet. My registrar said it may take up to 24 hours. In my case, I started getting intermittent results within 2 hours, but it took more than 12 hours before everything stabilized to the new settings.

But it was worth the effort to see version 1.0 of my created-from-scratch static web site up and running on my domain! And since it’s such a small and simple site with little traffic, it will cost me only a few pennies per month to host in this manner.


Dipping toes in AWS via Rails Tutorial Sample App

Amazon Web Services is a big, big ball of yarn. For somebody just getting started, the list of AWS products is quite intimidating. It’s not any fault of Amazon’s; it’s just the nature of building out such a comprehensive system. Fortunately, Amazon is not blind to the fact that people can get overwhelmed, and puts admirable effort into a gentle introduction via their Getting Started resources: a series of (relatively) simple guided tours through select parts of the AWS domain.

At the end of it all, though, a developer has to roll up their sleeves and dive in. The question then is: where? On previous attempts I couldn’t make up my mind and got stuck. This time around, I have a starting point: Michael Hartl’s Ruby on Rails Tutorial sample app, which wants to store images on Amazon S3 (Simple Storage Service).

Let’s make it happen.

One option was to blaze the simplest, most direct path to get rolling, but I resisted. The example I found on stackoverflow.com granted the Rails app full access to storage using my AWS root credentials. Functionally speaking that would work, but it is a very bad idea from a security practices standpoint.

So I took a detour through Amazon IAM (Identity and Access Management). I wanted to learn how to do a properly scoped security access scheme for the Rails sample app, rather than giving it the Golden Key to the entire kingdom. Unfortunately, since IAM is used to manage access to all AWS properties, it is a pretty big ball of yarn itself.

Eventually I found my on-ramp to AWS: A section in the S3 documentation that discussed access control via IAM. Since I control the rails app and my own AWS account, I was able to skip a lot of the cross-account management for this first learning pass, boiling it down to the basics of what I can do for myself to implement IAM best practices on S3 access for my Rails app.

After a few bumps in the exploration effort, here’s what I ended up with.


Root account: This has access to everything, so we want to use it as little as possible to minimize the risk of compromise. Log in to this account just long enough to activate multi-factor authentication and create an IAM user account with “AdministratorAccess” privileges. Log out of the management console as root, then log back in under the new admin account to do everything else.

Admin account: This account is still very powerful so it is still worth protecting. But if it should be compromised, it can at least be shut down without closing the whole Amazon account. (If root is compromised, all bets are off.) Use this account to set up the remaining items.

Storage bucket: While logged in as the admin account, go to the S3 service dashboard and create a new storage bucket.

Access policy: Go to the IAM dashboard and create a new S3 access policy. The “resource” of the policy is the storage bucket we just created. The “actions” we allow are the minimum set needed for the Rails sample app and no more (a sketch of the policy document follows the list).

  1. PutObject – this permission allows the Rails app to upload the image file itself.
  2. PutObjectAcl – this permission allows the Rails app to change the access permission on the image object, making the image publicly visible to the world. This is required for the image to be used as the src of an HTML <img> tag in the Rails app.
  3. DeleteObject – when a micropost is deleted, the app needs this permission so the corresponding image can be deleted as well.
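Expressed as an actual policy document, the grant above comes out to something like the sketch below (created here via boto3; the policy name and bucket name are placeholders to be swapped for your own).

```python
import json
import boto3

iam = boto3.client("iam")

# Placeholder bucket name; use the bucket created in the previous step.
bucket = "rails-sample-app-uploads"

# Allow only the three actions the Rails sample app needs, and only on
# objects inside that one bucket.
policy_document = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:PutObject", "s3:PutObjectAcl", "s3:DeleteObject"],
        "Resource": "arn:aws:s3:::{}/*".format(bucket),
    }],
}

response = iam.create_policy(
    PolicyName="RailsSampleAppS3Access",
    PolicyDocument=json.dumps(policy_document),
)
print(response["Policy"]["Arn"])
```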

Access group: From the IAM dashboard, create a new access group. Under the “Permissions” list of the group, attach the access policy we just created. Now any account that is a member of the group has enough access to the storage bucket to run the Rails sample app.

User: From the IAM dashboard, create a new user account to be used by the Rails app. Add this new user to the access group we just created, so it is a part of the group and can use the access policy we created. (And no more.)

This new user, which we granted only a low level of access, will not need a password since we’ll never log in to the Amazon management console with it. But we will need to generate an access key and secret key for the app.
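Scripted with boto3, the group, user, and key steps might look roughly like this; the names are placeholders, and the policy ARN is the one printed by the create_policy sketch above.

```python
import boto3

iam = boto3.client("iam")

# Placeholder names; the policy ARN comes from the create_policy call above.
group_name = "rails-sample-app-s3"
user_name = "rails-sample-app"
policy_arn = "arn:aws:iam::123456789012:policy/RailsSampleAppS3Access"

# Access group: create it and attach the scoped S3 policy.
iam.create_group(GroupName=group_name)
iam.attach_group_policy(GroupName=group_name, PolicyArn=policy_arn)

# User: create it with no console password, then add it to the group.
iam.create_user(UserName=user_name)
iam.add_user_to_group(GroupName=group_name, UserName=user_name)

# Generate the access key / secret key pair the Rails app will use.
# These are the values that later go into Heroku config; store them securely.
keys = iam.create_access_key(UserName=user_name)["AccessKey"]
print(keys["AccessKeyId"], keys["SecretAccessKey"])
```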

Once all of the above are done, we have everything we need to give Heroku for the Rails Tutorial sample app: an S3 storage bucket name, plus the access key and secret key of the low-level user account we created to access that bucket.


While this is far more complex than the stackoverflow.com answer, it is more secure. Plus, it was a good exercise for learning the major bits and pieces of the AWS access control system.

The above steps ensure that, if the Rails sample app should be compromised in any way, the hacker has only the permissions we granted to the app and no more. They could put new images in the S3 bucket and make them visible, or delete existing images, but they can’t do anything else in that bucket.

And most importantly, the hacker has no access to any other part of my AWS account.

Rails Tutorial (Take 2)

With all the fun and excitement around 3D printing, I’ve let my Ruby on Rails education lapse. I want to dive back in, but it’s been long enough that I felt I needed a review. Also, during my time away, the Ruby on Rails team released version 5, and Michael Hartl’s Ruby on Rails Tutorial was updated accordingly.

Independent of the Rails 5 updates, it was well worth my time to go through the book again. On the second run, I understood some things that didn’t make sense before. It was also good to look at the first do-nothing “hello app” and the second automated-scaffold “toy app” with a little more Rails knowledge under my belt. The book is structured so a beginner doesn’t have to understand the mechanics of the hello or toy apps, but readers with a bit of background will get something out of them.

The release notes for the update mentioned that a few sections were rearranged for better pacing and structure, and that more exercises were added for readers to check their progress. Both are incremental improvements that I appreciated, but neither was especially earth-shattering.

Action Cable, one of the big signature features of Rails 5, was not rolled into the book. Hartl is handling that in a separate tutorial, Learn Enough Action Cable to be Dangerous, which I will go through at some point in the near future.

Towards the end of the book, Hartl introduced an optional advanced concept: using Amazon Web Services to store user image uploads. I skipped that section the first time through, and decided to dive into it this time.

I quickly found myself in a deep rabbit hole. Amazon Web Services has many moving parts designed for a wide range of audiences and it’s a challenge to get started without being overwhelmed.

Which is where I am now. Lots more exploration ahead!