I’ll be going to MS Ignite this year and speaking on Virtual Reality and Azure. Come check it out.
Going to Microsoft Ignite
So it is finally time to retire my previous set of presentations and put some new ones together. There are several major categories I need to cover:
This post is a way for me to put out some ideas on what to speak on and edit them as time passes and I get feedback. So here are my initial attempts:
In this talk I want to show some command-and-control scenarios and some data ingestion. For command and control, I will show a scenario where a mobile device running a Xamarin application controls an LED, a servo, a relay, or some other visible, actionable device. On the data ingestion side, a set of devices will collect physical sensor data and broadcast it over the air via Bluetooth. The mobile app will consume that data and showcase it in the application.
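To make the ingestion side concrete, here is a minimal sketch of how a sensor reading might be packed into a telemetry message before the app forwards it on. The field names and the `build_telemetry` helper are illustrative assumptions, not a fixed schema from any SDK:

```python
import json
import time

def build_telemetry(device_id, sensor, value, unit):
    """Pack one sensor reading into a JSON telemetry message.
    All field names here are illustrative, not a required schema."""
    return json.dumps({
        "deviceId": device_id,
        "sensor": sensor,
        "value": value,
        "unit": unit,
        "timestamp": time.time(),
    })

# Example: a Bluetooth temperature beacon reading, ready to send upstream.
msg = build_telemetry("ble-temp-01", "temperature", 22.5, "C")
```

In the real demo this payload would go to IoT Hub via the device SDK; the sketch only shows the shape of the data moving through the pipeline.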
This talk can reuse the Bluetooth devices from the IoT and mobile talks. Using those, a gateway can have a module configured to consume the device data. With that I can show how to build a module and deploy it to the gateway. Then I will add an Azure Stream Analytics module on the edge to process the data locally before it ever leaves the gateway.
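The core of a custom edge module is just logic that inspects incoming messages and decides which to pass along. As a rough sketch (the `filter_module` function and message shape are my own stand-ins, not the IoT Edge SDK API):

```python
def filter_module(messages, sensor_type):
    """Simulated edge-module logic: pass through only messages from the
    given sensor type, a stand-in for real IoT Edge routing and filtering."""
    return [m for m in messages if m.get("sensor") == sensor_type]

# Only the temperature readings survive to the next module in the route.
incoming = [
    {"sensor": "temperature", "value": 21},
    {"sensor": "humidity", "value": 40},
]
forwarded = filter_module(incoming, "temperature")
```

In a deployed module this decision would happen inside the module's message handler, with the output written to a named route in the deployment manifest.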
This talk is very similar to the Azure IoT Edge talk, except instead of focusing on creating the module in code, I focus on the architecture of the gateway and show how to create an edge module in Stream Analytics. The data consumed will most likely come from the same Bluetooth devices as above.
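A typical Stream Analytics edge job aggregates readings over fixed time windows. This sketch mimics a tumbling-window average in plain Python so the idea is visible without a running job (the function and its signature are illustrative):

```python
from collections import defaultdict

def tumbling_average(readings, window_seconds):
    """Group (timestamp, value) readings into fixed, non-overlapping windows
    and average each one, mimicking a Stream Analytics TumblingWindow aggregate."""
    buckets = defaultdict(list)
    for ts, value in readings:
        buckets[int(ts // window_seconds)].append(value)
    # Key each result by the window's start time.
    return {w * window_seconds: sum(v) / len(v) for w, v in sorted(buckets.items())}
```

The equivalent Stream Analytics query would use `GROUP BY TumblingWindow(second, 10)`; the point of the demo is that this aggregation runs on the gateway itself.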
This talk will take a slightly different set of hardware. Video is such a game changer when it comes to machine learning; it really shows how powerful machine learning is. That demonstration will require a video input. I would love to be able to use the HoloLens, but I think I will settle for an easy-to-use camera. Once the camera is streaming data to a gateway, I will try to push an ML algorithm for computer vision down to the gateway to consume the data and output any detection events.
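Whatever vision model runs on the gateway, the output step is the same: turn raw predictions into detection events worth reporting. A minimal sketch of that last step, assuming a hypothetical `(label, confidence)` prediction format and event schema:

```python
def detection_events(predictions, threshold=0.8):
    """Keep only detections above a confidence threshold and shape them
    as events a gateway module might emit (illustrative schema)."""
    return [
        {"label": label, "confidence": conf}
        for label, conf in predictions
        if conf >= threshold
    ]

# Low-confidence noise is dropped; only real detections become events.
events = detection_events([("person", 0.95), ("dog", 0.40)])
```

Thresholding at the gateway keeps the video itself local and sends only small event messages to the cloud, which is the architectural point of the demo.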
I want to continue showing the different ways an application can consume data and interact with the user and the environment without a screen. To do this, the following interfaces should be shown:
This is one I really need to think on. I would like to get a combination of each input to each output.
One presentation should be on how Fusethru’s Fuseworks technology can greatly reduce the computing cost of distributed IoT data. In the demo, I will move a large amount of data from different incoming sensors through the Fuseworks application on a Multitech box, showing how easy it is to use and how the application leverages the gateway’s existing compute to run big data queries.
The simplest demo. Go through the different styles of Azure IoT architecture and showcase a use case for each.