The Past, Present, and Future of Deep Learning In PyTorch
March 10th, 2019
42 mins 12 secs
About this Episode
The current buzz in data science and big data is around the promise of deep learning, especially when working with unstructured data. One of the most popular frameworks for building deep learning applications is PyTorch, in large part because of its focus on ease of use. In this episode Adam Paszke explains how he started the project, how it compares to other frameworks in the space such as TensorFlow and CNTK, and how it has evolved to support deploying models into production and on mobile devices.
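For listeners curious about what sits underneath frameworks like PyTorch, the key technique is reverse-mode automatic differentiation (one of the links below). The toy sketch that follows is purely illustrative — a minimal scalar version in plain Python, not PyTorch's actual implementation, which operates on tensors and is written largely in C++:

```python
class Value:
    """A scalar that records the operations that produced it, so gradients
    can later be computed by reverse-mode automatic differentiation."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None  # how to push gradients to parents
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))

        def _backward():
            # d(a+b)/da = 1, d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))

        def _backward():
            # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically order the recorded graph, then apply the
        # chain rule from the output back to the inputs.
        topo, visited = [], set()

        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)

        self.grad = 1.0
        for v in reversed(topo):
            v._backward()


# z = x*y + x, so dz/dx = y + 1 and dz/dy = x
x, y = Value(2.0), Value(3.0)
z = x * y + x
z.backward()
print(z.data, x.grad, y.grad)  # 8.0 4.0 2.0
```

This "define-by-run" style — the graph is recorded as ordinary Python code executes — is the same eager-execution model that PyTorch popularized and that the episode contrasts with graph-first frameworks.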
- Hello and welcome to Podcast.__init__, the podcast about Python and the people who make it great.
- When you’re ready to launch your next app or want to try a project you hear about on the show, you’ll need somewhere to deploy it, so take a look at our friends over at Linode. With 200 Gbit/s private networking, scalable shared block storage, node balancers, and a 40 Gbit/s public network, all controlled by a brand new API you’ve got everything you need to scale up. And for your tasks that need fast computation, such as training machine learning models, they just launched dedicated CPU instances. Go to pythonpodcast.com/linode to get a $20 credit and launch a new server in under a minute. And don’t forget to thank them for their continued support of this show!
- Visit the site to subscribe to the show, sign up for the newsletter, and read the show notes. And if you have any questions, comments, or suggestions I would love to hear them. You can reach me on Twitter at @Podcast__init__ or email email@example.com
- To help other people find the show please leave a review on iTunes and tell your friends and co-workers
- Join the community in the new Zulip chat workspace at pythonpodcast.com/chat
- Check out the Practical AI podcast from our friends at Changelog Media to learn and stay up to date with what’s happening in AI
- You listen to this show to learn and stay up to date with the ways that Python is being used, including the latest in machine learning and data analysis. For even more opportunities to meet, listen, and learn from your peers you don’t want to miss out on this year’s conference season. We have partnered with O’Reilly Media for the Strata conference in San Francisco on March 25th and the Artificial Intelligence conference in NYC on April 15th. Here in Boston, starting on May 17th, you still have time to grab a ticket to the Enterprise Data World, and from April 30th to May 3rd is the Open Data Science Conference. Go to pythonpodcast.com/conferences to learn more and take advantage of our partner discounts when you register.
- Your host as usual is Tobias Macey and today I’m interviewing Adam Paszke about PyTorch, an open source deep learning platform that provides a seamless path from research prototyping to production deployment
- How did you get introduced to Python?
- Can you start by explaining what deep learning is and how it relates to machine learning and artificial intelligence?
- Can you explain what PyTorch is and your motivation for creating it?
- Why was it important for PyTorch to be open source?
- There is currently a large and growing ecosystem of deep learning tools built for Python. Can you describe the current landscape and how PyTorch fits in relation to projects such as TensorFlow and CNTK?
- What are some of the ways that PyTorch is different from TensorFlow and CNTK, and what are the areas where these frameworks are converging?
- How much knowledge of machine learning, artificial intelligence, or neural network topologies is necessary to make use of PyTorch?
- What are some of the foundational topics that are most useful to know when getting started with PyTorch?
- Can you describe how PyTorch is architected/implemented and how it has evolved since you first began working on it?
- You recently reached the 1.0 milestone. Can you talk about the journey to that point and the goals that you set for the release?
- What are some of the other components of the Python ecosystem that are most commonly incorporated into projects based on PyTorch?
- What are some of the most novel, interesting, or unexpected uses of PyTorch that you have seen?
- What are some cases where PyTorch is the wrong choice for a problem?
- What is the process for incorporating these new techniques and discoveries into the PyTorch framework?
- What are the areas of active research that you are most excited about?
- What are some of the most interesting/useful/unexpected/challenging lessons that you have learned in the process of building and maintaining PyTorch?
- What do you have planned for the future of PyTorch?
Keep In Touch
- In Praise Of Copying by Marcus Boon
- University of Warsaw
- Polish Olympiad In Informatics
- Deep Learning
- Automatic Differentiation
- Torch 7
- TensorFlow 2
- EPFL (École polytechnique fédérale de Lausanne)
- Transfer Learning
- Reinforcement Learning
The intro and outro music is from Requiem for a Fish by The Freak Fandango Orchestra / CC BY-SA