
Microsoft
AI/Machine Learning + Edge Computing for Retail & Open Source Tutorial
Empowering Edge Intelligence
Projects completed by the inaugural Microsoft Garage Silicon Valley Intern Team, Summer 2019
Project Type
Open Source Tutorial & Retail Solution
Duration
12 weeks, Summer 2019
Tools
Microsoft Office Suite
Team
1 UX Designer and 5 Software Developers
Design Digest
As the primary UX writer and user tester on the inaugural Microsoft Garage intern team in Silicon Valley, I collaborated with a team of four intern software developers and partnered closely with stakeholders from Azure Machine Learning, ONNX Runtime, and Azure Data Box Edge. Our goal was to help bring machine learning to the edge by making highly technical workflows more approachable and accessible for developers working with constrained hardware.
I co-authored an open-source tutorial for deploying ONNX models on the NVIDIA Jetson Nano using Azure IoT Edge. My focus was on rewriting complex technical documentation with clarity, inclusivity, and usability in mind—ensuring developers of all backgrounds could implement edge AI solutions with confidence. Our work contributed to the ONNX Runtime 0.5 release, which introduced support for hardware acceleration at the edge, and the tutorial is now part of Microsoft’s official tooling to help democratize AI and machine learning at scale.
Integrate Azure with Machine Learning on the Jetson Nano
The Microsoft Garage intern program brought together a cross-functional team to explore practical applications of AI and machine learning at the edge. Our challenge: to create a developer-friendly, open source solution that integrates Azure Machine Learning with edge devices, specifically the NVIDIA Jetson Nano.
Integrate Azure with machine learning execution on the NVIDIA Jetson platform (an ARM64 device)
In this tutorial, you will learn how to integrate Azure services with machine learning on the NVIDIA Jetson device (an ARM64 hardware platform) using Python. By the end of this sample, you will have a low-cost DIY solution for object detection within a space and a unique understanding of integrating an ARM64 platform with Azure IoT services and machine learning.
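To make that end goal concrete, here is a minimal sketch of the kind of inference the tutorial builds toward: loading an ONNX object detection model with ONNX Runtime in Python on the device. The model file name, input size, and preprocessing are illustrative assumptions, not the tutorial's exact code.

```python
# Minimal sketch of the core inference step: running an ONNX object detection
# model with ONNX Runtime on the device. Model path, input size, and
# normalization are placeholder assumptions.
import numpy as np
import onnxruntime as ort
from PIL import Image

session = ort.InferenceSession("tiny_yolo.onnx")  # hypothetical model file
input_name = session.get_inputs()[0].name

# Resize a camera frame to the model's expected input and normalize it.
image = Image.open("frame.jpg").resize((416, 416))
tensor = np.asarray(image, dtype=np.float32).transpose(2, 0, 1)[np.newaxis] / 255.0

# Run inference; outputs contain candidate boxes and scores to post-process.
outputs = session.run(None, {input_name: tensor})
print([o.shape for o in outputs])
```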
Discover
We researched barriers developers face when deploying machine learning models on edge hardware—particularly ARM64-based devices. Through stakeholder interviews and reviews of technical documentation, we uncovered that setup complexity, hardware-specific quirks, and lack of streamlined tutorials were major blockers.
Define
We identified an opportunity to create a modular, open source tutorial that demystifies edge AI deployment. The target audience was AI developers and IoT engineers looking to accelerate inference on lightweight, GPU-enabled edge devices using familiar Microsoft tools.
Design
As the team's UX designer, I produced deliverables a little different from those I had contributed on previous projects. It was exciting to lead the design of the tutorial structure and supporting assets as a content strategist and copywriter. I mapped the process, optimized the developer experience through UX writing, and worked closely with PMs and engineers to ensure clarity, accessibility, and technical accuracy. I also advocated for inclusive documentation practices to support developers from diverse backgrounds.
To ensure technical precision, we consulted regularly with engineers and program managers from our stakeholder teams: Azure Machine Learning, ONNX Runtime, and Azure Data Box Edge. These experts helped validate our instructions, confirmed compatibility across platforms, and advised on best practices for deploying ONNX models using Azure tools.
I led the drafting and UX writing of the tutorial, focusing on:
Clear labeling and structure of the steps (e.g., container setup, model export, edge deployment; see the sketch after this list)
Explaining complex workflows using plain language
Anticipating developer pain points and surfacing troubleshooting tips
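As a concrete illustration of the "model export" step named above, the sketch below converts a trained PyTorch model to ONNX so it can run under ONNX Runtime on the device. The model, input shape, and opset are placeholder assumptions; the tutorial's own export flow may differ.

```python
# Illustrative sketch of a generic model-export step: serializing a PyTorch
# model to ONNX. Model choice, input shape, and opset are assumptions.
import torch
import torchvision

model = torchvision.models.mobilenet_v2(weights=None).eval()
dummy_input = torch.randn(1, 3, 224, 224)  # assumed input shape

torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",
    input_names=["input"],
    output_names=["output"],
    opset_version=11,
)
```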
We also ran hands-on user testing sessions with the tutorial and the hardware, which allowed us to:
Watch first-time users navigate the tutorial end-to-end
Get feedback from developers with varied levels of familiarity with ONNX, Azure IoT Edge, and embedded devices
Validate that the tutorial performed reliably across multiple hardware setups
Deliver
We released the open source tutorial on GitHub: ONNX Runtime IoT Edge with Azure and Jetson Nano. The solution uses Docker containers to streamline deployment, leverages ONNX Runtime with TensorRT for GPU acceleration, and captures telemetry for remote monitoring via Azure.
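As a hedged sketch of how those pieces fit together at runtime: GPU-accelerated inference through ONNX Runtime's TensorRT execution provider, with detection results forwarded upstream through an Azure IoT Edge module client. The package and API names below reflect the current onnxruntime and azure-iot-device SDKs rather than the 2019 releases, and the model path and message format are illustrative.

```python
# Sketch: TensorRT-accelerated ONNX inference plus telemetry via Azure IoT Edge.
# Assumes the code runs inside a Docker module deployed by the IoT Edge runtime.
import json
import numpy as np
import onnxruntime as ort
from azure.iot.device import IoTHubModuleClient, Message

# Prefer TensorRT on the Jetson's GPU, falling back to CUDA/CPU if unavailable.
session = ort.InferenceSession(
    "model.onnx",  # hypothetical model path
    providers=["TensorrtExecutionProvider", "CUDAExecutionProvider", "CPUExecutionProvider"],
)

# Connect to the local IoT Edge runtime from inside the module container.
client = IoTHubModuleClient.create_from_edge_environment()
client.connect()

def report_detections(detections):
    """Forward detection results to Azure IoT Hub for remote monitoring."""
    msg = Message(json.dumps({"detections": detections}))
    client.send_message_to_output(msg, "output1")

# Example: run inference on a dummy frame and report a placeholder result.
input_name = session.get_inputs()[0].name
frame = np.zeros((1, 3, 416, 416), dtype=np.float32)
outputs = session.run(None, {input_name: frame})
report_detections({"num_outputs": len(outputs)})
```

Packaging everything in a Docker module is what lets the same image be deployed, updated, and monitored remotely through Azure IoT Edge.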
Debrief
Our open-source tutorial was featured in the ONNX Runtime 0.5 release and highlighted in a Microsoft Open Source spotlight, demonstrating a commitment to making edge AI more accessible. Validated on the NVIDIA Jetson Nano, the tutorial now serves as a reference implementation for deploying machine learning models on IoT devices. This cross-functional effort underscored the importance of collaboration, inclusive content strategy, and user-centered design in developer tooling.
During my internship, I grew as a UX designer, accessibility advocate, and content strategist. I learned how to communicate across technical and non-technical roles, design for a variety of learning styles, and champion clarity in complex technical documentation. Most importantly, I saw firsthand how vital user testing is—helping us uncover blind spots and improve the experience in ways we couldn’t have anticipated on our own.