Demo: Augmented Reality view of live micro:bit sensor data
By Rodhán
I was thinking about how to display live sensor readings from a micro:bit in an augmented reality iPhone app, and the first method I came up with would have been cumbersome, to say the least. My plan was to encode the readings somehow in a 5x5 matrix, display that as a pattern on the micro:bit LEDs, and then read and decode the pattern with some computer vision code in the iPhone app. What a nightmare that would have been.
Then I remembered that, on top of all the other amazing awesomeness that is somehow squeezed into the micro:bit, there is a full Bluetooth LE stack. Simply communicating our sensor readings over Bluetooth is going to be a lot easier!
Next, I thought I was going to have to spend some time reading docs and fighting with the intricacies of Bluetooth LE to get something working, but it turns out I was wrong again. There is Microbit-Swift, a beautifully crafted and really well-documented Swift library that can be dropped into any Xcode project and more or less just works.
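To give a sense of what the library spares you from writing, here is a rough sketch of the raw CoreBluetooth plumbing needed just to subscribe to the micro:bit's Temperature Service by hand. This is not the Microbit-Swift API, just the do-it-yourself alternative; the UUIDs are the ones published in the micro:bit Bluetooth profile, but double-check them against the profile document before relying on them.

```swift
import CoreBluetooth

// Sketch only: scan for a micro:bit advertising the Temperature Service
// and subscribe to temperature notifications. UUIDs are assumed from the
// published micro:bit Bluetooth profile.
final class MicrobitTemperatureReader: NSObject, CBCentralManagerDelegate, CBPeripheralDelegate {
    static let temperatureService = CBUUID(string: "E95D6100-251D-470A-A062-FA1922DFA9A8")
    static let temperatureCharacteristic = CBUUID(string: "E95D9250-251D-470A-A062-FA1922DFA9A8")

    private var central: CBCentralManager!
    private var microbit: CBPeripheral?
    var onTemperature: ((Int8) -> Void)?

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        if central.state == .poweredOn {
            central.scanForPeripherals(withServices: [Self.temperatureService])
        }
    }

    func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any], rssi RSSI: NSNumber) {
        microbit = peripheral            // keep a strong reference or the connection drops
        peripheral.delegate = self
        central.stopScan()
        central.connect(peripheral)
    }

    func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
        peripheral.discoverServices([Self.temperatureService])
    }

    func peripheral(_ peripheral: CBPeripheral, didDiscoverServices error: Error?) {
        for service in peripheral.services ?? [] {
            peripheral.discoverCharacteristics([Self.temperatureCharacteristic], for: service)
        }
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didDiscoverCharacteristicsFor service: CBService, error: Error?) {
        for characteristic in service.characteristics ?? []
        where characteristic.uuid == Self.temperatureCharacteristic {
            peripheral.setNotifyValue(true, for: characteristic)
        }
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didUpdateValueFor characteristic: CBCharacteristic, error: Error?) {
        // The temperature characteristic is a single signed byte in °C.
        guard let data = characteristic.value, let raw = data.first else { return }
        onTemperature?(Int8(bitPattern: raw))
    }
}
```

All of this delegate boilerplate (plus state restoration, reconnects, and error handling) is exactly what a wrapper library gets to hide.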
And so, I created this simple demo:
How I Made This
I started with this sample project from Apple that detects images, and made just a few modifications: detect a marker image that I created, track the marker as it moves in the scene, connect to the micro:bit over Bluetooth to read the temperature value, and display the temperature over the marker.
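Those modifications boil down to surprisingly little ARKit code. The sketch below (my own condensation, not Apple's sample verbatim) assumes the marker lives in an AR Resource Group named "AR Resources", and `latestTemperature` stands in for the value arriving over Bluetooth:

```swift
import ARKit
import SceneKit

// Sketch: track a known marker image and float a temperature label over it.
final class ARViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!
    var latestTemperature = 0        // updated from the Bluetooth callback

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARImageTrackingConfiguration()
        if let markers = ARReferenceImage.referenceImages(
                inGroupNamed: "AR Resources", bundle: nil) {
            configuration.trackingImages = markers   // the marker image(s) to follow
        }
        sceneView.delegate = self
        sceneView.session.run(configuration)
    }

    // ARKit calls this when the marker is detected; the returned node
    // stays attached to the marker as it moves through the scene.
    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        guard anchor is ARImageAnchor else { return nil }
        let text = SCNText(string: "\(latestTemperature)°C", extrusionDepth: 1)
        let textNode = SCNNode(geometry: text)
        textNode.scale = SCNVector3(0.001, 0.001, 0.001)  // SCNText units are huge
        textNode.eulerAngles.x = -.pi / 2                 // lie flat over the marker
        let node = SCNNode()
        node.addChildNode(textNode)
        return node
    }
}
```

Using `ARImageTrackingConfiguration` (rather than plain detection with a world-tracking session) is what keeps the label glued to the marker while it moves.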
Next I created a new micro:bit MakeCode project, added the Bluetooth extension and these blocks:
Note: I could have read the temperature directly over Bluetooth using the micro:bit's Temperature Service, but I decided to use events to transmit the data from the micro:bit to the iPhone, just to demonstrate the flexibility of this approach.
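For reference, the blocks translate to something like the following in MakeCode's JavaScript view. This is a sketch, not the exact program: the event id (9000 here) is an arbitrary choice, and the iPhone side has to register interest in the same id via the micro:bit's Bluetooth Event Service before events are relayed to it.

```javascript
// Show a tick/cross when a phone connects or disconnects, then
// periodically raise a custom event carrying the temperature.
bluetooth.onBluetoothConnected(function () {
    basic.showIcon(IconNames.Yes)
})
bluetooth.onBluetoothDisconnected(function () {
    basic.showIcon(IconNames.No)
})
basic.forever(function () {
    control.raiseEvent(9000, input.temperature())
    basic.pause(1000)
})
```

Because the event carries an arbitrary id and value, the same pattern works unchanged for any other reading (or button press) you want to surface in the AR view.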
And that is pretty much all there is to it! It’s a very quick proof of concept but now that I know that it is possible I’m sure I’ll come up with all sorts of crazy ideas for combining AR and micro:bits.