Dillon Erb (@dlnrb, CEO @HelloPaperSpace) talks about what exactly MLOps is, Serverless AI platforms, and how developers can utilize GPUs for AI/ML.
SHOW SPONSOR LINKS:
- Datadog Security Monitoring Homepage - Modern Monitoring and Analytics
- Try Datadog yourself by starting a free, 14-day trial today. Listeners of this podcast will also receive a free Datadog T-shirt.
- Taos Homepage
- Taos - Gartner MQ - Cloud Professional Services
- Studio 3T - Homepage
CLOUD NEWS OF THE WEEK - http://bit.ly/cloudcast-cnotw
PodCTL Podcast is Back (Enterprise Kubernetes) - http://podctl.com
- PaperSpace website
- Covid-19 Blog
Topic 1 - Dillon, welcome to the show, tell us a little bit about yourself and how you got involved in this space?
Topic 2 - I’ve had a running joke on the show that a market doesn’t exist until you attach Ops to it. Today we’ll talk about MLOps. Give an introduction for those who aren’t familiar with it.
Topic 3 - What exactly is a Serverless AI Platform? How does this differ from traditional CI/CD platforms that our listeners would be used to? Is this abstracting away the infrastructure layer for MLOps teams?
Topic 3a - Switching gears from Ops to Developers, what do you mean when you say that you make it easy for developers to use GPUs? What do developers need to know about hardware-level stuff like GPUs that they didn’t need to know with CPUs?
Topic 4 - As with all things emerging tech, the use cases are constantly evolving. What are the early initial use cases that you are seeing? Are there unique things that emerge for gaming or media applications?
Topic 5 - How does access to data models fit into all of this?
Topic 6 - I noticed your company published some articles on Covid-19. Can you explain what is going on there?
- Email: show at thecloudcast dot net
- Twitter: @thecloudcastnet