We finally settled on our LED allocation over migB. We are going to use the same principle as explained in a previous post (article and video). We will also put one RGB LED on each pole to fill the dark spots. The main issue with this layout is the conversion of coordinates. With aligned LEDs the conversion is quite easy: spherical coordinates can easily be projected onto the Cartesian system that governs our strips.
We also thought about a spiral wrapped around migB, or parallel circles mapping the sphere, but managing coordinates then becomes very difficult. We considered a mapping matrix to handle LED positions and solid angles, but it requires a calibration step and the strips to be mounted almost identically in every sphere. Worse, imagine you would like to light the LED pointing in a given direction: it becomes really hard. We chose simplicity over homogeneity. This spatial layout will also strongly help the player understand the displayed symbols.
The number of LEDs will be between 50 and 100 per migB, depending on its final size, in order to keep price and power consumption reasonable.
Charles, Flo and I were looking for technological solutions. I was in charge of the mobile platform, so I visited the Robotshop website. My constraints were: small (less than 12 cm), dynamic (fast) and tracked.
I found this: http://www.robotshop.com/en/mini-robotshop-rover-chassis-kit.html and I still have to investigate whether I can add an incremental encoder. Alexis gave me access to the TutoBot (2012 project) git repository so I can study how they built their mobile platform.
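If an encoder can be added, reading it comes down to classic quadrature decoding. A minimal sketch (not taken from the TutoBot code, just the textbook lookup-table approach on the two-bit A/B state):

```c
#include <stdint.h>

/* Quadrature decoding lookup: index = (prev_state << 2) | new_state,
 * where a state is the 2-bit value (A << 1) | B. Entries are +1 / -1
 * for valid transitions, 0 for no change or an invalid (skipped) step. */
static const int8_t qdec_table[16] = {
     0, +1, -1,  0,
    -1,  0,  0, +1,
    +1,  0,  0, -1,
     0, -1, +1,  0
};

/* Call on every sample of the A/B channels; returns the updated count. */
int32_t qdec_update(int32_t count, uint8_t prev_state, uint8_t new_state)
{
    return count + qdec_table[((prev_state & 3u) << 2) | (new_state & 3u)];
}
```

The sign of the accumulated count gives the wheel's direction, and its magnitude the distance travelled, which is what odometry needs.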
We spent the whole afternoon together defining the PSSCs. By the way, we changed the gameplay a little: the autonomous robot part was too difficult, so we opted instead for a neutral robot that moves randomly around the arena and shoots. We also made the slides for the presentation.
Last night, we almost completed our first presentation for ROSE. We covered the product, the technologies we will use, and the planning of the different tasks. Even though it is hard to estimate how much time a task will take, we tried to make the workload as evenly distributed as possible.
Finally, we think that Wi-Fi is a good choice for our bear: we need enough bandwidth to stream audio, and we would like to connect Kudly to the Internet easily. Bluetooth is slower, and with that technology we would also need a Bluetooth hotspot to connect the bear to the network.
We also abandoned the idea of the LED screen, as it would consume too much power. Instead, we will simply add LEDs to make Kudly more attractive and to implement some educational games (colour recognition, for example).
You can also follow us on Twitter if you are interested: https://twitter.com/KudlyProject
Today we chose the components we will use for the project. We also took a look at how the DTW algorithm works: implemented naively, keeping the whole cost matrix, it takes a lot of memory. In our case, since we will compare sequences over 6 dimensions, we would need about 1 MB; fortunately we can reduce this to a few kB. We decided to use a Cortex-M4 processor with an FPU, chose a lithium battery, and thought about the design this implies (along with other components for the sensors and the LEDs). Last but not least, we made the list of the PSSCs and discussed a bit how to organize our team work.
Now we have designed the (almost) final game scenario. In fact I’m rather impatient to play it now !
I’ve searched through Sparkfun to see how we could power our system over USB. I have a few ideas; moreover, many of Sparkfun’s component schematics are available online, so we can draw inspiration from them.
At last, we discussed the PSSCs and defined a minimum gameplay to reach as a first main goal. I just hope we can achieve it! Using Trello is very convenient; I had already used it during the BDE campaign, and I’m hoping it will bring us luck this time too.
These last days were dedicated to preparing the first presentation, which takes place in a few hours! For that, we had to specify the functionalities the bear will have, and then select the technologies and components to realize them.
The provisional schedule is established; I hope we will meet the deadlines…
Good news Everyone !
Tomorrow morning we’ll be presenting our project concept and guidelines in class.
We defined how the different modules in our system will interact, their functionalities and uses, and synthesized them into PSSCs, each assigned to one person responsible in the group.
We also scheduled the deadlines of the different PSSCs. I hope we’ll have time to do everything we want, or at least the core ideas that make our project fun and functional!
For the components, we looked for a Wi-Fi module able to stream video. We first checked the ESP8266 to see if it could fit our needs, but we wouldn’t be able to stream video between the camera and the Wi-Fi module over a UART. We are now thinking about the Texas Instruments CC3000, which uses SPI.
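A rough back-of-the-envelope comparison shows why the UART is the bottleneck (all the figures below are assumptions for illustration, not measurements):

```c
/* Payload bit rate of a UART link: 'baud' bits/s on the wire, with
 * 1 start + 1 stop bit per 8-bit data byte (10 wire bits per byte). */
double uart_payload_bps(double baud)
{
    return baud * 8.0 / 10.0;
}

/* Bit rate needed by a compressed video stream. */
double stream_bps(double bytes_per_frame, double fps)
{
    return bytes_per_frame * 8.0 * fps;
}
```

Assuming, say, 8 kB JPEG frames at 15 fps, the stream needs about 960 kbit/s, while a 115200-baud UART carries only about 92 kbit/s of payload. An SPI bus clocked at even a few MHz has plenty of headroom, hence the move to an SPI-attached module.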
See you next time
Yesterday and today, we finally chose how we are going to stream audio from the bear. At first we were thinking about a Bluetooth module capable of sending sound without using the processor. But we also need to process the sound to detect when the baby cries or wakes up, and having both the Bluetooth module and the processor connected to the microphones and the speaker raises some issues.
We are therefore going to route the sound through the processor before sending it. And for higher throughput, data will be transferred over Wi-Fi rather than Bluetooth.
We also finished setting the PSSCs and the calendar for our project.
In order to keep our migB smooth and beautiful (no ugly connectors), we plan to implement wireless charging. We are going to use the Qi standard, the best-known one nowadays. Basically, we will connect a power regulator between a receiving coil (input) and a lithium battery charger (output); the regulator and the charger may even come in a single chip. That way, energy can flow properly from the coil to the battery.
Moreover, it is not really expensive: I think we can do it for less than 20€, and that is a cautious upper bound.
You can easily add wireless charging to your smartphone; many tutorials can be found on the web. You can also buy ready-to-use coils that provide a regulated 5 V output over a micro-USB port. Pretty easy.
Today was a good day! We’ve decided exactly what our HeRos will do.
– in “normal” mode: each HeRos is controlled by a user via smartphone. We’ll stream what the camera sees directly to the smartphone over WiFi (WiFi Direct or normal WiFi, we don’t know yet). There will be no video processing after all, because the WiFi bandwidth will be enough to stream video, and the HeRos’ autonomous mode won’t be what we initially intended. Basic shots will be done with an IR laser diode, and the 360° shots (and healing) with an IR emitter under a diffusive dome at the top of each robot. Each robot will also have four “targets” on its surface (IR sensors with diffusive domes to increase the target area) at which a direct shot has to be aimed, and three wide-angle IR sensors that will detect proximity 360° shots or healing.
– in “berserk” mode: a robot can be set in this autonomous mode: the camera is turned off, and the robot isn’t controlled by a user any longer. Instead, it roams the battlefield pseudo-randomly, avoiding obstacles along the way (with an infrared or ultrasonic sensor; we haven’t decided yet, which is better in your opinion?). While it moves, it regularly fires a 360° proximity shot and long bursts of direct shots (IR laser diode) without aiming. For short periods of time, the 360° shooting turns into 360° healing. Basically, in this mode the HeRos turns into a moving mine.
– the secondary modules will be small components containing only one element: a magnet. When “plugged” into the HeRos, each magnet will be detected by a Hall effect sensor. Our vision is: three different slots on the surface of the HeRos, each slot corresponding to a level of power (low, medium, high) and having a different shape, so that only a module of the right level (i.e. one that fits) can be plugged into each slot. Depending on the orientation of the magnet inside, the module is detected as either defensive or offensive. The info from the Hall sensors is then sent back to the phone, and the corresponding abilities are unlocked for the HeRos.
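Assuming linear (bipolar) Hall sensors, the slot decoding described above is just a matter of the sign of the reading, which reveals which magnet pole faces the sensor. A minimal sketch (the dead-band value is a hypothetical placeholder):

```c
#include <stdint.h>

typedef enum { SLOT_EMPTY, SLOT_OFFENSIVE, SLOT_DEFENSIVE } slot_state_t;

/* Decode one slot from a signed Hall-sensor reading: values near zero
 * mean no magnet is present; otherwise the sign of the field tells
 * which pole faces the sensor, i.e. how the module is oriented. */
slot_state_t decode_slot(int16_t hall_reading)
{
    const int16_t NOISE = 50;  /* hypothetical dead band around zero */
    if (hall_reading > NOISE)  return SLOT_OFFENSIVE;
    if (hall_reading < -NOISE) return SLOT_DEFENSIVE;
    return SLOT_EMPTY;
}
```

Polling the three slots and sending the resulting states to the phone is then all the firmware has to do for the module system.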
We’ve been looking for the right components to do all this (especially a camera whose DMA can connect directly to the SPI to feed the WiFi module), and we’re still looking for the right moving base (the Pololu chassis doesn’t allow us to implement odometry, so we have to look for something else).
We’ve also prepared for tomorrow’s presentation of the project!
Until next time!