8/12/2019
Researchers, developers, and engineers gathered at Facebook’s Menlo Park campus last week for the first of two PyTorch Summer Hackathons. Following the release of PyTorch 1.2, participants formed teams and spent two days building machine learning (ML) projects with some of the latest features available in the open source ML framework, including revamped domain libraries and improvements that make it easier to ship models to production.
In this post, we highlight some of the winning projects from the Summer Hackathon at Menlo Park, and share details on how you can participate in the online Global Summer Hackathon, which is now open for submissions.
With PyTorch 1.0, we introduced TorchScript to provide a seamless path from research prototyping in eager execution mode to production deployment in graph mode. TorchScript works by compiling or tracing Python code to a Python-free, statically typed graph representation that can be optimized and executed in production environments.
PyTorch 1.2 brings a more polished TorchScript environment that streamlines the exporting of models containing control flow, making it easier to transition code for production deployment. Here’s an example of the new torch.jit.script() API, which can be used either as a function or as a decorator:
import torch

class MyModule(torch.nn.Module):
    def __init__(self, N, M):
        super(MyModule, self).__init__()
        self.weight = torch.nn.Parameter(torch.rand(N, M))

    def forward(self, input):
        if input.sum() > 0:
            output = self.weight.mv(input)
        else:
            output = self.weight + input
        return output

# Compile the model code to a static representation
my_script_module = torch.jit.script(MyModule(3, 4))

# Save the compiled code and model data so it can be loaded in C++
my_script_module.save("my_script_module.pt")
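The saved archive can later be loaded and executed without the original Python source, for example with torch.jit.load in Python (or torch::jit::load in C++). The snippet below is a minimal sketch; the input size is chosen to match the (3, 4) weight above.

# Load the compiled module and call it like a regular module
loaded = torch.jit.load("my_script_module.pt")

# forward() multiplies the (3, 4) weight by the input when its sum is
# positive, so a length-4 vector is a valid example input
example_input = torch.ones(4)
output = loaded(example_input)
print(output.shape)  # torch.Size([3])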
Alongside these improvements, we released new versions of the domain libraries for computer vision, natural language processing, and speech/audio. The torchvision, torchtext, and torchaudio libraries provide convenient access to common datasets, transformations, and state-of-the-art models, allowing researchers and engineers to speed up development within these domains.
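As a quick illustration of that convenience (a minimal sketch; the dataset and model choices below are examples, not part of the 1.2 release notes), torchvision lets you compose transforms, download a standard dataset, and load a pretrained model in a few lines:

import torch
import torchvision
from torchvision import transforms

# Common preprocessing pipeline for image data
preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.5, 0.5, 0.5], std=[0.5, 0.5, 0.5]),
])

# Download a standard dataset and wrap it in a DataLoader
dataset = torchvision.datasets.CIFAR10(root="./data", train=True,
                                       download=True, transform=preprocess)
loader = torch.utils.data.DataLoader(dataset, batch_size=32, shuffle=True)

# Load a pretrained model from torchvision
model = torchvision.models.resnet18(pretrained=True)
model.eval()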
Visit the PyTorch Blog to learn more about PyTorch 1.2.
Researchers and developers from the PyTorch community joined the two-day PyTorch Summer Hackathon at Menlo Park to build applications and models designed to make a positive impact on people and businesses.
We would like to thank all the participants who joined the hackathon both locally and from around the world to work on ideas that tackled areas from astrophysics to education. It was really exciting to see the variety of innovative ideas and the speed of development. Here are a few of the winning projects:
Learn2learn is a PyTorch library designed to make meta-learning more accessible to ML developers. It includes implementations of MAML and Meta-SGD, as well as a Task Generator for easily creating a distribution to learn from.
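To illustrate the pattern a library like this supports, here is a generic first-order MAML-style inner/outer loop sketched in plain PyTorch. This is not learn2learn’s actual API; the task batch format and loss function are assumptions made for the example.

import copy
import torch
import torch.nn.functional as F

def fomaml_step(model, meta_opt, tasks, inner_lr=0.01, loss_fn=F.mse_loss):
    """One first-order MAML meta-update over a batch of tasks.

    Each task is assumed to be ((x_support, y_support), (x_query, y_query)).
    """
    meta_opt.zero_grad()
    for (x_s, y_s), (x_q, y_q) in tasks:
        # Inner loop: adapt a copy of the model on the task's support set
        learner = copy.deepcopy(model)
        inner_opt = torch.optim.SGD(learner.parameters(), lr=inner_lr)
        loss_fn(learner(x_s), y_s).backward()
        inner_opt.step()
        learner.zero_grad()

        # Outer loop: evaluate the adapted copy on the query set and
        # accumulate its gradients onto the original parameters
        # (the first-order approximation to the MAML meta-gradient)
        loss_fn(learner(x_q), y_q).backward()
        for p, lp in zip(model.parameters(), learner.parameters()):
            p.grad = lp.grad.clone() if p.grad is None else p.grad + lp.grad
    meta_opt.step()

A meta-optimizer such as torch.optim.Adam(model.parameters()) would drive the outer update; libraries like learn2learn package this bookkeeping behind higher-level abstractions.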
HelloWorldNet applies PyTorch to improve the speed and reliability of detecting extrasolar planets. It extends Exonet and Astronet and provides dataloaders for sources such as Kepler, TESS, and K2.
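For a rough sense of what a light-curve dataloader can look like, here is a hypothetical sketch; the class name, data layout, and random stand-in data below are illustrative only and not taken from the project.

import numpy as np
import torch
from torch.utils.data import Dataset, DataLoader

class LightCurveDataset(Dataset):
    """Hypothetical dataset of flux time series with planet/no-planet labels."""
    def __init__(self, curves, labels):
        # curves: iterable of equal-length 1-D NumPy arrays of flux values
        # labels: 0/1 integers indicating a planet candidate
        self.curves = [torch.as_tensor(c, dtype=torch.float32) for c in curves]
        self.labels = torch.as_tensor(labels, dtype=torch.long)

    def __len__(self):
        return len(self.curves)

    def __getitem__(self, idx):
        return self.curves[idx], self.labels[idx]

# Example usage with random stand-in data
curves = [np.random.randn(2001).astype(np.float32) for _ in range(8)]
labels = [0, 1] * 4
loader = DataLoader(LightCurveDataset(curves, labels), batch_size=4)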
MineTorch is a programming platform for children to learn about and integrate deep learning models into their projects. It provides a drag-and-drop user interface that generates Python code.
Following the two-day hackathon in Menlo Park, we are announcing our Global Summer Hackathon — an online hackathon that makes it easy for developers around the world to participate. The focus of this hackathon is on using PyTorch to build creative, well-implemented solutions that can create a positive impact on businesses and people. The solution could be an ML model, an application, or a creative project (such as art or music).
Participants can submit their projects between now and September 16 to compete for more than $60,000 in cash prizes and the opportunity to attend and share their projects at the PyTorch Developer Conference on October 10, 2019.
Join the online Global Summer Hackathon and get started here.
Product Manager, Facebook AI
Software Engineer, Facebook AI Research