Making parallel computing easy to use has been described as “a problem as hard as any that computer science has faced”. With such a big challenge ahead we need to make sure as many people as possible have access to open parallel hardware and development tools.
Inspired by great hardware communities like Raspberry Pi and Arduino, we also see a critical need for a truly open, high-performance computing platform that will enable us to close the knowledge gap in parallel programming.
The goal of the Parallella project is to democratize access to parallel computing by providing an affordable open hardware platform and open source tools, and by supporting learning and the development of software that can harness the power of parallel systems.
The purpose of this post is to demonstrate how to add a logo to your time-lapse videos. The logo can be any graphic you can create, and could just as easily be a watermark or text. The objective was to add my Raspberry Pi Spy logo to the bottom corner of a video at the same time I rendered my time-lapse videos from the source images.
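One way to do the overlay step is with ffmpeg's overlay filter. Here's a minimal sketch that just builds the command line; the filenames and the 10-pixel corner margin are assumptions for illustration, not the exact invocation from the post:

```python
# Sketch: build an ffmpeg argv that stamps a logo into a video's bottom-right
# corner. Filenames and the margin are made up for illustration.

def overlay_cmd(video_in, logo, video_out, margin=10):
    """Return an ffmpeg argv list overlaying `logo` onto `video_in`."""
    # main_w/main_h and overlay_w/overlay_h are the overlay filter's built-in
    # variables for the video and logo dimensions, so this pins the logo to
    # the bottom-right corner with a small margin.
    filt = "overlay=main_w-overlay_w-{m}:main_h-overlay_h-{m}".format(m=margin)
    return ["ffmpeg", "-i", video_in, "-i", logo,
            "-filter_complex", filt, "-codec:a", "copy", video_out]

print(" ".join(overlay_cmd("timelapse.mp4", "logo.png", "branded.mp4")))
```

On a machine with ffmpeg installed you could pass the list straight to `subprocess.run`. A PNG logo with transparency works best, since the alpha channel is respected by the overlay.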
Raspberry Pi Camera Board – The Raspberry Pi Camera Module is a custom-designed add-on for the Raspberry Pi. It attaches to the Raspberry Pi via one of the two small sockets on the board's upper surface, using the dedicated CSI interface, which was designed especially for interfacing with cameras. The CSI bus is capable of extremely high data rates, and it exclusively carries pixel data. (read more)
The project has been so much fun that I decided to build another one, but with cheaper and more efficient hardware by choosing the Raspberry Pi.
Working with the Adafruit WebIDE: This was no problem at all. I left the IDE enabled on the radio. When the radio boots, the Python scripts are executed right from the workspace, which lets me make minor fixes afterwards.
I’m a child of the 1980s. Miami Vice! Skinny ties! Big hair! And every town had at least one good video game arcade.
Thanks to the super affordable Raspberry Pi and some clever software, anyone can re-create the classic arcade experience at home. Adafruit brings the genuine “clicky” arcade controls, you bring the game files and a little crafting skill to build it.
A basic control setup can be assembled without any soldering! Jumpers plug straight into the Raspberry Pi GPIO header.
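Once the buttons are on the GPIO header, the software side is just reading pin levels and mapping them to game actions. A minimal sketch, assuming BCM pin numbers 17/22/23/27 and internal pull-ups (your wiring will differ):

```python
# Sketch: map arcade buttons wired to GPIO pins onto game actions.
# Pin numbers and action names are assumptions; adjust to your wiring.
BUTTON_MAP = {
    17: "left",
    22: "right",
    23: "jump",
    27: "fire",
}

def pressed_actions(pin_states):
    """Given {pin: level} readings (0 = pressed, pull-ups assumed),
    return the active actions, sorted for stable output."""
    return sorted(BUTTON_MAP[p] for p, level in pin_states.items() if level == 0)

ON_PI = False  # set True on a Raspberry Pi with RPi.GPIO installed
if ON_PI:
    import RPi.GPIO as GPIO
    GPIO.setmode(GPIO.BCM)
    for pin in BUTTON_MAP:
        GPIO.setup(pin, GPIO.IN, pull_up_down=GPIO.PUD_UP)
    states = {pin: GPIO.input(pin) for pin in BUTTON_MAP}
    print(pressed_actions(states))
```

In practice most cabinet builds hand this job to a keyboard-emulation daemon such as retrogame, but the principle is the same: a closed switch pulls the pin low, and software translates that into an input event.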
I just finished a blog post describing how to use a RaspberryPi and a Pebble watch to control your home theater. Now that the Pebble watch SDK has been released I thought it would be fun to connect my RaspberryPi project (the Open Source Universal Remote http://opensourceuniversalremote.com) with my Pebble watch.
I bought most of the electronics parts from you, and I couldn’t have made it this far without the guides and helpful instructions. Keep up the great work!
This is the first guide (to date) that kids can use to learn how to connect and use an RFID reader with the Raspberry Pi.
It has been a long time. I’ve been three months without writing here (afk for a while). Now I’m back, and I’ve joined a new makerspace (Made at Mob) that just opened here in Barcelona, and I’m now committed to finishing what I started: building an open source kiosk with the raspi. And now that I’m done working at home, I can finally play with stuff while surrounded by amazing people and things.
I’m back with an exciting tutorial that will teach you how to read RFID tags with the Raspberry Pi. This opens new horizons for what we can do with our tiny, cheap little friend.
PN532 NFC/RFID controller breakout board – v1.3 – The PN532 is the most popular NFC chip, and is what is embedded in pretty much every phone or device that does NFC. It can pretty much do it all, such as read and write to tags and cards, communicate with phones (say, for payment processing), and ‘act’ like an NFC tag. If you want to do any sort of embedded NFC work, this is the chip you’ll want to use! (read more)
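To give a flavour of what reading a tag looks like in code, here's a minimal sketch using Adafruit's CircuitPython PN532 library over I2C. This is an assumption on my part (the tutorial itself may use a different stack, such as libnfc), and the wiring details are yours to fill in:

```python
# Sketch: poll a PN532 for a tag and print its UID.
# The adafruit_pn532 library and I2C wiring are assumptions for illustration.

def uid_to_hex(uid):
    """Format a raw tag UID (bytes) as colon-separated hex, e.g. '04:a2:5f:1e'."""
    return ":".join("{:02x}".format(b) for b in uid)

ON_PI = False  # set True on a Pi wired to a PN532 over I2C
if ON_PI:
    import board
    import busio
    from adafruit_pn532.i2c import PN532_I2C

    i2c = busio.I2C(board.SCL, board.SDA)
    pn532 = PN532_I2C(i2c)
    pn532.SAM_configuration()  # put the chip in normal card-reading mode
    uid = pn532.read_passive_target(timeout=0.5)  # None if no tag in range
    if uid is not None:
        print("Found tag:", uid_to_hex(uid))
```

The UID is what most beginner projects key off: compare it against a known list and you have a simple access-control or inventory system.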
Here’s a follow-up to the electronic doorplate project that Andrea, a student at the University of Naples Parthenope, shared with us last fall. This time he has switched from Arduino to RasPi to create a nice stand-alone photo/video display for office doors, showing the study group, professor, office hours, contact information and more, all updated in real time. Check out the Tuco project:
NOTE: The original post is in Italian; here is an English translation:
Tuco 2.0 is an “intelligent” electronic door nameplate: open source and open hardware.
It was designed to be installed on the doors of our university offices.
The purpose of the nameplate is to display real-time information such as the office number, lecturers, office hours, email and/or phone contacts, and above all notices pulled in real time from one of the university’s communication channels.
Tuco 2.0 is the evolution of the previously published prototype; this version uses the Raspberry Pi, a minicomputer designed in the UK by the Raspberry Pi Foundation.
This version differs from the previous one in that the base software is a Linux-based operating system.
The scheme of operation is:
Power supply from the mains
System bootstrap
Automatic opening of a full-screen web page stored on the university’s server
There’s no shortage of novel Kickstarter projects that aim to change how we think about the environment, but here’s one that could literally change how we look at it. Infragram, created by the civic science-minded folks at Public Lab puts low-cost infrared cameras into people’s hands so they can better understand the health of the plants around them.
The goal here is simple enough — by hacking these cameras to peer into the infrared (well, near-infrared) portion of the spectrum, Public Lab hopes to let users see how well plants are converting light into oxygen. The end result is a pair of images that, when processed properly, yield a single false-color image that shows off which plants (or parts of plants) are reflecting the most near-infrared light and are therefore absorbing the most red and blue light.
In a bid to get as many people as possible seeing plants in a different light, the most affordable tier will see backers at the $10 level receive a “superblue” filter that attaches to existing digital cameras (here’s a list of cameras that seem to work well with the filter).
A contribution of $35 nets you the most basic hardware component of the bunch — a cheap webcam that works just as well when lashed to a Raspberry Pi as it does when hooked up to your laptop. $95 nets you something really interesting: a bespoke point-and-shoot 2-megapixel camera that already has one of those “superblue” sensors nestled inside it. Once backers start snapping photos of the local greenery, they’ll be able to upload them to a work-in-progress web service to get those false-color images. The team is also working on a spate of analytical tools to cull more information from those images, so the curious nature nut can gain even more insight into the flora around them.
The Public Lab team is no stranger to these sorts of crowdfunded science projects — last year they successfully raised $110,000 for a homebrew spectrometry kit that rather smartly relied on a shard of a DVD-R disc. This new project has only been live for five days, but a slew of enthusiastic backers has already brought the team within spitting distance of its $30,000 funding goal. With a month and a half left to go, Public Lab is on track to have yet another crowdfunded scientific success on its hands — here’s hoping that some of those backers will put those Infragram cameras in youngsters’ hands. After all, we could probably do with a new generation of young people who are sensitive to the plight of those poor plants.
Heath Robinson RPi wifi radio! Has, so far: 3 buttons, 2 dials, a 4-digit 7-segment display clock (controlled by a MAX7219), a 128×64 LCD (ST7565), wifi dongle, cheap USB ‘sound card’, and a PWM analogue meter showing download rate. Needs: 1x box.
Runs MPD for the audio. Code updates the clock, LCD etc. and calls MPC with various BBC playlists.
Not only can the Raspberry Pi camera capture photos, it can also capture full HD video at 1920×1080. The official camera module has been optimised to use the full hardware media capabilities of the Pi’s processor, which allows it to handle video that a standard 700MHz CPU would struggle to process.
The idea is very simple: we press a button and send a request to our website, which then increases the Sandwich/Coffee Count. The design isn’t much harder than that: thanks to the Raspberry Pi we can build a very basic circuit (power -> button -> line out to RPi -> LED -> resistor -> ground; it doesn’t get much simpler than that) and then just “catch” the signal and voilà! The RPi sees the signal and fires off a POST request.
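The software half of that is only a few lines. Here's a sketch of the idea; the endpoint URL, payload shape and GPIO pin 18 are all made up for illustration, so swap in your own site's API:

```python
# Sketch of the button-press counter. COUNT_URL, the JSON payload and pin 18
# are hypothetical -- substitute your own endpoint and wiring.
import json

COUNT_URL = "https://example.com/api/count"  # hypothetical endpoint

def count_request(item="sandwich"):
    """Build the POST (url, body, headers) that a button press fires at the site."""
    body = json.dumps({"item": item, "delta": 1}).encode("utf-8")
    return COUNT_URL, body, {"Content-Type": "application/json"}

ON_PI = False  # set True on a Raspberry Pi with RPi.GPIO installed
if ON_PI:
    import urllib.request
    import RPi.GPIO as GPIO

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(18, GPIO.IN, pull_up_down=GPIO.PUD_DOWN)
    GPIO.wait_for_edge(18, GPIO.RISING)  # block until the button is pressed
    url, body, headers = count_request()
    urllib.request.urlopen(urllib.request.Request(url, data=body, headers=headers))
```

For a long-running counter you'd wrap the wait-and-POST in a loop with a short debounce delay, but the press-to-POST path really is this small.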
Here is a quick test of the collision detection using an Xtion Pro Live (PrimeSense) with a Raspberry Pi as the brain. Note that the RPi is overclocked to 1000MHz. (You will notice a slight click in the middle of me turning on the collision interrupt call. I forgot I had turned it off at the beginning since it was so close to the camera, and only realized it when it wasn’t responding to the cone in front of it.) Collision is only detected in the middle of its vision; this is because the legs sometimes cross over into its field of vision during full-body rotations. I have also integrated the three vision options into a “heads-up display” using OpenNI and OpenCV.
Speech is handled via an espeak library my friend Kurt created. The gait algorithm is a complete rewrite, in order to give the camera as much stability as possible. (Plus I just wanted to see if I could do it…) The USB hub is a de-cased powered 4-port which gets its power from a BEC that takes the 3-cell LiPo (11 volts) and drops it down to 5.1 volts, which in turn powers the Raspberry Pi, Xtion and the amplified speaker. All code is written in C++.
In this post I will explain how I made some timelapse videos using the Raspberry Pi camera module. This is a two-step process which involves getting the camera module to take a series of stills over a period of time and then combining them into an MP4 video file.
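The two steps boil down to two commands: `raspistill` in timelapse mode for the capture, then ffmpeg to stitch the numbered stills into a video. A small sketch that builds both command lines; the 10-second interval, 1-hour duration, 25fps and filenames are assumptions for illustration:

```python
# Sketch: the two commands behind a timelapse -- capture stills with
# raspistill, then stitch them into an MP4 with ffmpeg. All the numbers
# and filenames here are example values.

def capture_cmd(interval_ms=10000, duration_ms=3600000, pattern="img_%04d.jpg"):
    """raspistill argv: one still every `interval_ms` for `duration_ms` total.
    The %04d in the pattern becomes a zero-padded frame number."""
    return ["raspistill", "-t", str(duration_ms), "-tl", str(interval_ms),
            "-o", pattern]

def stitch_cmd(pattern="img_%04d.jpg", fps=25, out="timelapse.mp4"):
    """ffmpeg argv: combine the numbered stills into a video at `fps` frames/sec."""
    return ["ffmpeg", "-r", str(fps), "-i", pattern, "-c:v", "libx264", out]

print(" ".join(capture_cmd()))
print(" ".join(stitch_cmd()))
```

At 25fps, every 25 stills become one second of video, so an hour of 10-second captures (360 frames) yields roughly a 14-second clip; tune the interval to taste.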