Welcome to the zeroth newsletter: August 2020!

Data Science teams are typically excellent at developing Machine Learning (ML) models for their target applications, but can sometimes struggle with other aspects of the ML lifecycle, including the deployment and maintenance of said models.

This month's newsletter looks at some of the basics around getting ML models up, running and 'into production', as well as the emerging discipline of MLOps (sometimes referred to as ModelOps) that aims to support this process:

Serverless ML: Deploying Lightweight Models at Scale
Deploying ML models ‘into production’ as scalable APIs can be tricky. This post looks at how Serverless Functions can make deployment easier for some applications, and gives an example project to get you started deploying your own models as Google Cloud Functions.
MLOps: Building Continuous Training and Delivery Pipelines
MLOps is an emerging engineering movement aimed at accelerating the delivery of reliable, working ML software on an ongoing basis. This post provides an intro to MLOps and gives you an example project to get you started with building your own ML pipelines using GitHub Actions and Google Cloud.
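To give a flavour of what a continuous training pipeline involves, here's a minimal sketch of the usual stages (ingest, train, evaluate, deploy) chained behind a quality gate. All of the names, the toy data and the one-parameter "model" are illustrative assumptions for this sketch, not the pipeline from the post above; in practice a CI system such as GitHub Actions would run a script like this on a schedule or when new data lands.

```python
# Each pipeline stage is an ordinary function so the whole pipeline can
# be run (and tested) as a single script by a CI job.

def ingest():
    # Stand-in for pulling fresh training data: pairs following y = 3*x.
    return [(x, 3 * x) for x in range(10)]

def train(data):
    # "Train" a one-parameter model y = w*x by least squares:
    # w = sum(x*y) / sum(x*x).
    num = sum(x * y for x, y in data)
    den = sum(x * x for x, _ in data)
    return num / den

def evaluate(w, data):
    # Mean absolute error of the fitted model on the data.
    return sum(abs(w * x - y) for x, y in data) / len(data)

def deploy(w):
    # In a real pipeline this would push the model to a registry or a
    # cloud endpoint; here it just reports success.
    return f"deployed model with w={w}"

def run_pipeline(error_budget=0.1):
    data = ingest()
    w = train(data)
    err = evaluate(w, data)
    # Quality gate: only promote the model if it meets the error budget,
    # so a bad training run never reaches production automatically.
    if err <= error_budget:
        return deploy(w)
    return "model rejected"
```

The quality gate is the important part: continuous *training* only becomes continuous *delivery* when there's an automated check standing between a freshly trained model and production.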

If you've never come across Serverless, or would like to learn more about the concept and technology, here's an introductory post giving you a rundown of what it is, its relative strengths and weaknesses, and a short example too:

A Brief Introduction to Serverless Computing
This post introduces the concepts behind ‘serverless computing’ – a way of quickly and easily deploying lightweight apps (e.g. APIs). It looks at the associated advantages and disadvantages of serverless, and gives a short example showing how to deploy your own serverless function to Google Cloud.
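Tying the two threads together, a model-serving function in the style described above might look something like the sketch below. The names and the stand-in "model" are illustrative assumptions; a real Google Cloud Function entry point receives an HTTP request object and would call its `get_json()` method, but the handler here accepts the parsed body directly so the sketch is self-contained.

```python
import json

def _load_model():
    # Hypothetical stand-in for loading a trained model from storage
    # (e.g. with joblib); here it's simply y = 2*x + 1.
    return lambda x: 2 * x + 1

# Load once at import time so warm invocations of the function reuse
# the model rather than reloading it on every request.
MODEL = _load_model()

def predict(request_json):
    """Entry point for a hypothetical prediction endpoint.

    Accepts the parsed JSON body of a request, runs the model, and
    returns a JSON string – the shape a lightweight prediction API
    typically takes when deployed as a serverless function.
    """
    x = request_json["x"]
    return json.dumps({"prediction": MODEL(x)})
```

Loading the model at module import rather than inside the handler is the one deployment-specific trick worth noting: it keeps cold starts slow but warm requests fast.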

News and articles from around the web

Here are a few interesting posts on ML, software engineering and general technology from around the web that have come to my attention over the last month:

1. How TikTok recommends videos for you

TikTok – the new Chinese social media video platform – has been in the news a lot recently. This (non-technical) post cuts through the political undercurrents to give a high-level look at how TikTok goes about generating recommendations for new and existing users.

How TikTok recommends videos #ForYou
TikTok’s mission is to inspire creativity and bring joy. We’re building a global community where you can create and share authentically, discover the world, and connect with others.

2. How to use scarce resources to perform high quality A/B tests

Running robust A/B tests is regarded as something of a gold standard for software products – both for prospective customers evaluating a piece of software and for product teams working to develop new features. However, running good-quality A/B tests is hard when resources are tight. This paper looks at how optimal testing strategies shift depending on the context of the test, and how small changes to these strategies could lead to improvements in productivity and innovation.

[PDF] A/B Testing with Fat Tails | Semantic Scholar
Large and thus statistically powerful A/B tests are increasingly popular in business and policy to evaluate potential innovations. We study how to optimally use scarce experimental resources to screen innovations.
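As background for reading the paper, here's the textbook two-proportion z-test commonly used to analyse A/B tests. To be clear, this is generic statistics, not the paper's fat-tailed framework – it's just the baseline kind of analysis whose resource demands the paper is concerned with.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test statistic for an A/B test.

    conv_a/conv_b are conversion counts, n_a/n_b are sample sizes for
    variants A and B respectively.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

def p_value(z):
    # Two-sided p-value from the standard normal distribution,
    # computed with the error function from the standard library.
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
```

The `se` term is where scarcity bites: halving the sample sizes inflates the standard error, which is why small teams have to think harder about which innovations are worth a test at all.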

3. Dynamically re-posing marketing collateral

This dry-sounding paper introduces an approach for taking an image of an individual, and transforming it into a new image of the same individual in a new pose, or in a new outfit. This could change the way marketing collateral and website content is generated for a host of businesses.

Neural Re-Rendering of Humans from a Single Image
Human re-rendering from a single image is a starkly under-constrained problem, and state-of-the-art algorithms often exhibit undesired artefacts, such as over-smoothing, unrealistic distortions of the body parts and garments, or implausible changes of the texture.

4. Breakthroughs in Embodied AI at Facebook

Facebook have been hard at work on advancing the field of 'Embodied AI' – developing systems that can understand and interact with the physical world much as people do. In this post, Facebook AI announce some impressive breakthroughs in developing new training and evaluation platforms (photo-realistic worlds, complete with high-resolution acoustic information), and state-of-the-art performance on benchmarking tasks. This could find its way to a device near you soon, so take a look:

New milestones in embodied AI
We’re announcing several new research milestones that push the limits of embodied AI.

5. Extracting high-performance models from randomly initialized neural networks

'Training' an ML model is the process of finding a set of parameters that enables the model to perform well on a given dataset. In many cases, training iteratively updates this parameter set (the weights) based on 'feedback' about how the model is performing at a target task. This paper shows that randomly initialized deep learning models can contain subnetworks that perform well on specific tasks without the weights ever being updated at all:

What’s Hidden in a Randomly Weighted Neural Network?
Training a neural network is synonymous with learning the values of the weights. By contrast, we demonstrate that randomly weighted neural networks contain subnetworks which achieve impressive performance without ever training the weight values.
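To make the idea concrete, here's a toy illustration: the weights below are drawn randomly once and never updated – we only search over binary masks to pick a well-performing subnetwork. This is a deliberately simplified sketch of the concept; the paper's actual method searches for masks far more cleverly than the brute force used here, and on real networks rather than a two-input linear toy.

```python
import itertools
import random

random.seed(0)

# Toy task: the label is True when x0 + x1 > 0.
data = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(500)]
labels = [x0 + x1 > 0 for x0, x1 in data]

# An over-parameterised linear "network": four random candidate weights
# per input. These are drawn once and never trained.
weights = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(2)]

def accuracy(mask):
    # The subnetwork's effective weight for each input is the sum of
    # the candidate weights the mask keeps.
    w = [sum(c for c, keep in zip(weights[i], mask[i]) if keep)
         for i in range(2)]
    preds = [w[0] * x0 + w[1] * x1 > 0 for x0, x1 in data]
    return sum(p == y for p, y in zip(preds, labels)) / len(labels)

# Brute-force search over all binary masks: the weights are never
# touched, we only choose which of them to keep.
masks = list(itertools.product([0, 1], repeat=4))
best = max(((m0, m1) for m0 in masks for m1 in masks), key=accuracy)
print("best subnetwork accuracy:", accuracy(best))
```

Even in this tiny setting, some mask over the frozen random weights usually classifies well – which is the intuition behind the paper's much stronger claim about large networks.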

General interest

And now for some miscellaneous bits and pieces:

Why efficiency is dangerous and slowing down makes life better | Psyche Ideas
The urge to do everything faster and better is risky. Far wiser to do what’s good enough for the range of possible futures
‎The H.P Lovecraft Literary Podcast on Apple Podcasts
The H.P. Lovecraft Literary Podcast has been creating podcasts and audio productions since 2009! Each week, hosts Chad Fifer and Chris Lackey discuss a piece of weird fiction. Talented voice actors bring the text to life. Music and sound effects create atmosphere.

And that's it for this month, thanks for reading!

If you'd like to get this newsletter direct to your inbox, remember to sign up now!