Amazon AWS re:Invent Round-Up…

At the Amazon AWS re:Invent event last week, Amazon made a slew of announcements relating to new AWS service offerings. Here’s a quick round-up of some of the things I noticed, with links to announcement blog posts rather than the actual services themselves…

First up, AWS Cloud9, a browser-based Integrated Development Environment (IDE) for writing, running, and debugging code. AWS have been moving into developer and productivity tools for some time, and this is another example of that.

For the non-developer, Amazon Sumerian may be of interest, providing a range of tools and resources that allow anyone to create and run augmented reality (AR), virtual reality (VR), and 3D applications. The interface is GUI-driven, so it’ll be interesting to see what Amazon have made of it compared to the horrors of their developer service UIs…

Whilst text editors are the preferred environment of “real” developers, many of the rest of us find Jupyter notebooks a more accommodating environment. So it’s interesting to see Amazon using them as part of their SageMaker service, a fully managed end-to-end machine learning service that enables data scientists, developers, and machine learning experts to quickly build, train, and host machine learning models at scale. The notebooks look to be used for data exploration and cleaning, whilst other components cover model building, training, and validation using Docker containers, with model hosting that can also provide A/B testing of multiple models simultaneously.
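By way of a quick illustration, here’s a minimal sketch of the hosting side using boto3, with entirely made-up model and endpoint names: an endpoint configuration that splits live traffic 90/10 between two model versions, which is how the A/B testing appears to be exposed.

```python
# A minimal, hypothetical sketch of SageMaker's hosted A/B testing via boto3.
# Model, endpoint and instance names are placeholders for illustration only.
import boto3

sm = boto3.client("sagemaker", region_name="us-east-1")

sm.create_endpoint_config(
    EndpointConfigName="churn-model-ab-test",    # hypothetical name
    ProductionVariants=[
        {
            "VariantName": "model-a",
            "ModelName": "churn-model-v1",        # assumed to exist already
            "InstanceType": "ml.m4.xlarge",
            "InitialInstanceCount": 1,
            "InitialVariantWeight": 0.9,          # 90% of traffic
        },
        {
            "VariantName": "model-b",
            "ModelName": "churn-model-v2",        # candidate model
            "InstanceType": "ml.m4.xlarge",
            "InitialInstanceCount": 1,
            "InitialVariantWeight": 0.1,          # 10% of traffic
        },
    ],
)

# Stand up (or update) the endpoint that serves both variants
sm.create_endpoint(
    EndpointName="churn-model",
    EndpointConfigName="churn-model-ab-test",
)
```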

On the back end, AWS already offer a range of database services, but now there’s also a graph database called Neptune. Graphs provide a really powerful way of thinking about and querying datasets – they really should be taught more, and at an earlier stage in computing education – so it’s nice to see support for graph databases growing.
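Neptune speaks Apache TinkerPop Gremlin (as well as SPARQL), so here’s a rough sketch of what querying it from Python might look like using the gremlinpython driver; the cluster endpoint and the toy person/knows graph are made up for illustration.

```python
# A hedged sketch of talking Gremlin to a Neptune cluster with gremlinpython
# (pip install gremlinpython). Endpoint and graph model are invented examples.
from gremlin_python.structure.graph import Graph
from gremlin_python.driver.driver_remote_connection import DriverRemoteConnection
from gremlin_python.process.graph_traversal import __

conn = DriverRemoteConnection(
    "wss://my-neptune-cluster.cluster-xxxx.us-east-1.neptune.amazonaws.com:8182/gremlin",
    "g",
)
g = Graph().traversal().withRemote(conn)

# Add a couple of vertices and an edge...
g.addV("person").property("name", "Alice").next()
g.addV("person").property("name", "Bob").next()
g.V().has("person", "name", "Alice").addE("knows").to(
    __.V().has("person", "name", "Bob")
).next()

# ...then walk the graph: "who does Alice know?"
print(g.V().has("person", "name", "Alice").out("knows").values("name").toList())

conn.close()
```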

I’m not sure how closely Neptune ties in to the new AWS AppSync, a fully managed serverless GraphQL service for real-time data queries, synchronization, communications and offline programming features. Skimming the announcement blog post, it looks as if AppSync ties in to DynamoDB rather than Neptune, so I’m not sure. Perhaps this is more about using it as a responsive data query language and server-side runtime for querying data sources that allow for real-time data retrieval and dynamic query execution?
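For what it’s worth, here’s a hypothetical sketch of calling an AppSync GraphQL endpoint from Python with the requests library, using API-key auth; the endpoint URL, key and Posts schema are all invented for illustration.

```python
# A hypothetical sketch of querying an AppSync GraphQL API from Python.
# URL, API key and schema are placeholders, not real values.
import requests

APPSYNC_URL = "https://example1234567890.appsync-api.us-east-1.amazonaws.com/graphql"
API_KEY = "da2-xxxxxxxxxxxx"   # placeholder API key

query = """
query ListPosts {
  listPosts {
    items { id title author }
  }
}
"""

resp = requests.post(
    APPSYNC_URL,
    json={"query": query},
    headers={"x-api-key": API_KEY},
)
print(resp.json())
```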

Whenever I try to use an Amazon Web Service, I find myself in the middle of configuration screen hell. When I saw the AWS Fargate service announcement open with the claim that “[a]t AWS we saw [container management solutions] as an opportunity to remove some undifferentiated heavy lifting”, it made me laugh. I find it easy to launch containers at places like Digital Ocean via Docker Machine, but perhaps things will now be easier on AWS?
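For comparison, this is roughly what “no cluster management” looks like from boto3: running an already-registered task definition on Fargate. The cluster, task definition and subnet IDs below are placeholders.

```python
# A hedged sketch of launching a container on Fargate via the ECS API in boto3.
# Cluster, task definition and subnet identifiers are placeholders.
import boto3

ecs = boto3.client("ecs", region_name="us-east-1")

ecs.run_task(
    cluster="default",
    launchType="FARGATE",
    taskDefinition="hello-world:1",               # assumed to be registered already
    count=1,
    networkConfiguration={
        "awsvpcConfiguration": {
            "subnets": ["subnet-0123456789abcdef0"],   # placeholder subnet
            "assignPublicIp": "ENABLED",
        }
    },
)
```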

Of course, containers are still virtualised offerings – maybe you really want access to the bare metal of servers you probably don’t have hanging around at home? The new Amazon EC2 bare metal instances offer:

  • Processing: two Intel Xeon E5-2686 v4 processors running at 2.3 GHz, with a total of 36 hyperthreaded cores (72 logical processors);
  • Memory: 512 GiB;
  • Storage: 15.2 terabytes of local, SSD-based NVMe storage;
  • Network: 25 Gbps of ENA-based enhanced networking.

That do you?

If you aren’t taken by the idea of running your data through AI models running in the cloud, even if it is on your own bare metal rented servers, you might fancy running them locally. DeepLens is more than just a video camera: it packs a 4 megapixel (1080P) camera and a 2D microphone array, along with 8GB of memory and Ubuntu 16.04, and it can download prebuilt models and run your data through them on the device itself. If it’s anything like the AI-packing Lockheed Martin F35 stealth jet, though, I’m not sure what it phones home…

Who remembers dialling the speaking clock when they were a kid? Here’s a modern-day version of a universal time signal that can help you keep distributed things in sync with each other. The new Amazon Time Sync Service provides a time synchronization service delivered over Network Time Protocol (NTP) which uses a fleet of redundant satellite-connected and atomic clocks in each region to deliver a highly accurate reference clock. NTP has been around for ages, but providing a reliable service as just another AWS offering helps build lock-in to the AWS ecosystem. (In part this reminds me of Google making a bid for DNS domination several years ago.)
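As a quick sketch, from inside an EC2 instance you can point any NTP client at the link-local address AWS documents for the service (169.254.169.123) – here using the third-party ntplib package from Python.

```python
# A minimal sketch of checking clock drift against the Amazon Time Sync Service.
# 169.254.169.123 is the link-local NTP endpoint AWS documents for the service;
# it is only reachable from inside a VPC, so run this on an EC2 instance.
import ntplib
from datetime import datetime, timezone

client = ntplib.NTPClient()
response = client.request("169.254.169.123", version=3)

print("server time:", datetime.fromtimestamp(response.tx_time, tz=timezone.utc))
print("local clock offset (s):", response.offset)
```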

Although Amazon have until now shied away from offering their own operating system (Fire OS never really went anywhere), I wonder if they are making a play for the internet of things with Amazon FreeRTOS, an IoT microcontroller operating system … that extends the FreeRTOS kernel, a popular real-time operating system, with libraries that enable local and cloud connectivity, security, and (coming soon) over-the-air updates. Hmm… Android wasn’t a Google initiative originally…

And if they can’t get you to use their IoT O/S, maybe you will avail yourself of the IoT Device Defender. Details on this look light at the moment, but it puts a marker down…

With Amazon Echo, Amazon made an early play for voice devices in the home. One of the benefits of getting services out there and used by folk is that you have more training data to feed your AI services. So it’s perhaps not surprising that there’s a push on a new batch of voice-related services (a quick sketch of calling them from Python follows the list):

  • Amazon Comprehend, a continuously-trained Natural Language Processing (NLP) service. Features are typical of this sort of service, as offered already by the likes of Google, Microsoft and IBM: language detection, entity detection, sentiment analysis, and key phrase extraction, but I’m not sure I’ve spotted topic modeling as a service before (but then, I haven’t been looking);
  • Amazon Translate is a high-quality neural machine translation service that … provide[s] fast language translation of text-based content;
  • Amazon Transcribe is an automatic speech recognition (ASR) service. Apparently, audio files stored on the Amazon Simple Storage Service (S3) can be analysed directly, with timestamps provided for each word and with inferred punctuation.
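For a flavour of the APIs, here’s a hedged boto3 sketch of all three; the S3 URI, bucket and job name are placeholders, and suitable IAM permissions are assumed.

```python
# A hedged sketch of Comprehend, Translate and Transcribe via boto3.
# The audio file URI and transcription job name are placeholders.
import boto3

comprehend = boto3.client("comprehend", region_name="us-east-1")
translate = boto3.client("translate", region_name="us-east-1")
transcribe = boto3.client("transcribe", region_name="us-east-1")

text = "AWS re:Invent announcements keep on coming."

# Comprehend: sentiment and entity detection on a piece of text
print(comprehend.detect_sentiment(Text=text, LanguageCode="en")["Sentiment"])
print(comprehend.detect_entities(Text=text, LanguageCode="en")["Entities"])

# Translate: English to French
print(translate.translate_text(
    Text=text, SourceLanguageCode="en", TargetLanguageCode="fr"
)["TranslatedText"])

# Transcribe: kick off an asynchronous job against an audio file already on S3
transcribe.start_transcription_job(
    TranscriptionJobName="demo-transcription",           # placeholder
    LanguageCode="en-US",
    MediaFormat="mp3",
    Media={"MediaFileUri": "s3://my-bucket/talk.mp3"},    # placeholder
)
```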

It wasn’t so very long ago that YouTube hadn’t even been imagined yet. But the pace of change is such that if you want to build your own, you probably can do, complete with monetisation services. AWS Media Services is an array of broadcast-quality media services, offering (a minimal sketch of the transcoding piece follows the list):

  • file-based transcoding for OTT, broadcast, or archiving. Features apparently include multi-channel audio, graphic overlays, closed captioning, and several DRM options;
  • live encoding to deliver video streams in real time to both televisions and multiscreen devices. Support for ad insertion, multi-channel audio, graphic overlays, and closed captioning;
  • video origination and just-in-time packaging that takes a single input and produces output for multiple devices, with support for multiple monetization models, time-shifted live streaming, ad insertion, DRM, and blackout management;
  • media-optimized storage that enables high performance and low latency applications such as live streaming;
  • monetization services that support ad serving and server-side ad insertion and accurate reporting of server-side and client-side ad insertion.
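As a minimal sketch of the file-transcoding piece (AWS Elemental MediaConvert) from boto3: the service uses per-account endpoints, so you look one up first and then talk to it. The full job Settings structure is extensive, so it’s only gestured at in a comment here rather than invented.

```python
# A hedged sketch of getting started with MediaConvert via boto3.
# MediaConvert requires a per-account endpoint, looked up before submitting jobs.
import boto3

mc = boto3.client("mediaconvert", region_name="us-east-1")
endpoint = mc.describe_endpoints()["Endpoints"][0]["Url"]

mediaconvert = boto3.client(
    "mediaconvert",
    region_name="us-east-1",
    endpoint_url=endpoint,
)

# A transcoding job would then be submitted with mediaconvert.create_job(),
# passing an IAM Role ARN and a Settings dict describing the input file on S3
# and the output groups (codecs, captions, DRM options and so on).
```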

Of course, if you don’t want to become an over-the-top TV broadcaster, you could always use a couple more of the new video services as part of your own state surveillance system.

For example, Amazon Kinesis Video Streams can ingest streaming video (or other time-encoded data) from millions of camera devices without you having to set up or run your own infrastructure. Hook that in to your public traffic cams and process it…
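A hedged sketch of the consuming side with boto3; the stream name is a placeholder, and a producer (for example, a camera running the Kinesis Video producer SDK) is assumed to be pushing fragments into the stream separately.

```python
# A hedged sketch of reading live fragments from a Kinesis Video stream.
# The stream name is a placeholder; a producer is assumed to be feeding it.
import boto3

kv = boto3.client("kinesisvideo", region_name="us-east-1")

# Each stream has its own data endpoint for media operations
endpoint = kv.get_data_endpoint(
    StreamName="traffic-cam-01",          # placeholder
    APIName="GET_MEDIA",
)["DataEndpoint"]

media = boto3.client(
    "kinesis-video-media",
    region_name="us-east-1",
    endpoint_url=endpoint,
)

# Start reading live fragments (MKV chunks) from "now"
resp = media.get_media(
    StreamName="traffic-cam-01",
    StartSelector={"StartSelectorType": "NOW"},
)
chunk = resp["Payload"].read(1024 * 1024)   # raw MKV bytes to feed a decoder
```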

…perhaps with Amazon Rekognition Image, which provides scalable image recognition and analysis, with object and scene detection, real-time facial recognition, celebrity recognition and text recognition. Its companion, Amazon Rekognition Video, is apparently the first video analysis service of its kind that uses the complete context of visual, temporal, and motion information in the video to perform activity detection and person tracking. Oh good…
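For the record, a hedged boto3 sketch of both flavours; the bucket and object keys are placeholders. Image analysis is synchronous, while video analysis is an asynchronous job you start and then poll.

```python
# A hedged sketch of Rekognition image and video analysis via boto3.
# Bucket and object names are placeholders.
import boto3

rek = boto3.client("rekognition", region_name="us-east-1")

# Still image: object/scene labels and celebrity matches
image = {"S3Object": {"Bucket": "my-bucket", "Name": "frame.jpg"}}   # placeholder
print(rek.detect_labels(Image=image, MaxLabels=10)["Labels"])
print(rek.recognize_celebrities(Image=image)["CelebrityFaces"])

# Video: kick off asynchronous person tracking on a clip stored in S3
job = rek.start_person_tracking(
    Video={"S3Object": {"Bucket": "my-bucket", "Name": "cctv.mp4"}}  # placeholder
)
# In practice you would poll (or subscribe via SNS) until JobStatus is SUCCEEDED
results = rek.get_person_tracking(JobId=job["JobId"])
print(results["JobStatus"])
```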

