25C3: Power line communication
[Florian] and [Xavier Carcelle] started the day at 25C3 by covering power line communication. PLC technology is not widespread in the US, but it has gained popularity in countries like France, where it's included in set-top boxes. PLC lets you create a local network using the AC wires in your wall. The team started exploring PLC because, despite being a newer technology, it behaves a lot like older networks. There's no segmentation in the wiring, which means it acts like a layer 2 hub: unlike on a switched network, you see all of the traffic. Most power meters don't filter out the signal, so you might even see your next-door neighbor's traffic on your line. [Florian] reports having seen all the traffic in a six-story building just by plugging in. The wiring also acts as a large antenna, so you could employ TEMPEST attacks.
The technology involved is certainly interesting, but they found a lack of tools to work with it, so they wrote FAIFA to fill the gap. It's currently a command-line tool for probing and configuring Intellon-based PLC devices (Intellon is the majority chip supplier for PLC). You can query devices, and it even has a sniffer mode. Sniffing may not seem interesting since devices that support the HomePlug AV standard use encryption, but they all ship from the factory with the same default key. In the future, they hope to build their own open source FPGA-based PLC device to take even more control of the system.
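Because HomePlug AV management traffic rides over ordinary Ethernet frames (EtherType 0x88E1), the gist of a sniffer mode can be sketched with a raw socket. This is an illustrative Python sketch under our own assumptions, not FAIFA itself; it requires root and Linux's AF_PACKET sockets:

```python
import socket
import struct

ETH_P_ALL = 0x0003
HOMEPLUG_AV_ETHERTYPE = 0x88E1  # EtherType used by HomePlug AV management frames

def is_homeplug_av(frame: bytes) -> bool:
    """True if a raw Ethernet frame carries a HomePlug AV management message."""
    if len(frame) < 14:  # too short to hold an Ethernet header
        return False
    (ethertype,) = struct.unpack("!H", frame[12:14])
    return ethertype == HOMEPLUG_AV_ETHERTYPE

def sniff_homeplug(iface: str = "eth0", count: int = 10) -> None:
    """Print the source MAC of HomePlug AV frames seen on iface.

    Needs root; Linux-only (AF_PACKET raw sockets).
    """
    s = socket.socket(socket.AF_PACKET, socket.SOCK_RAW, socket.htons(ETH_P_ALL))
    s.bind((iface, 0))
    seen = 0
    while seen < count:
        frame = s.recv(2048)
        if is_homeplug_av(frame):
            src_mac = frame[6:12]
            print("HomePlug AV frame from", src_mac.hex(":"))
            seen += 1
```

Since the power wiring behaves like a hub, every station's management frames show up here without any special configuration, which is exactly why the shared default key matters.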
25C3: State of the art wearable computing
[Kai Kunze] from the Embedded Systems Lab at Passau came to 25C3 to talk about Cyborgs and Gargoyles: State of the Art in Wearable Computing. There have been a lot of homebrew wearable computing solutions, but [Kai] specifically covered projects that could see everyday use in the real world.
The first was a prototype system they built for use in hospitals. The doctor wore a belt-buckle-sized Linux computer under his coat, attached to an RFID reader on his wrist. Scanning a patient's RFID wristband would bring up their chart on the screen, and the doctor could then scroll and select using a capacitive sensor built into the coat. Notes could be taken using a Bluetooth headset. The system kept the doctor's hands free for examining the patient while still providing as much information as possible. They actually ran this system for 30 days in a hospital.
The next example was a joint project with the car manufacturer Skoda. Quality assurance (QA) testing can be a long process, with many more steps than the assembly operations themselves. The team attached sensors to the worker to determine where the worker was in relation to the car and to take direct measurements of the object being tested. The use of wearable technology meant they collected more data than they normally would with standard QA testing, and they could quickly prompt the worker if a step was missed.
[Kai] identified a couple of projects that would make developing your own system much quicker. The Context Recognition Network Toolbox helps you identify what actions are being performed; they've used it to build systems like an automated kung-fu trainer that can recognize poses. There's also a context logger app for the iPhone that can be trained using accelerometer data to recognize different activities. He also mentioned a program developed with Zeiss for visually prompting workers as they performed tasks. In testing, it was 50% faster than text instructions and 30% faster than voice.
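Accelerometer-based activity recognition of the kind the context logger does boils down to extracting simple features from a window of samples and matching them against trained examples. A minimal nearest-centroid sketch in Python (the feature choice and windowing here are our own assumptions, not the actual toolkit's pipeline):

```python
import math

def features(window):
    """Mean and standard deviation of acceleration magnitude over one window."""
    mean = sum(window) / len(window)
    var = sum((x - mean) ** 2 for x in window) / len(window)
    return (mean, math.sqrt(var))

def train(labeled_windows):
    """Average the feature vectors per activity label into one centroid each."""
    sums, counts = {}, {}
    for label, window in labeled_windows:
        f = features(window)
        s = sums.setdefault(label, [0.0, 0.0])
        s[0] += f[0]
        s[1] += f[1]
        counts[label] = counts.get(label, 0) + 1
    return {lab: (s[0] / counts[lab], s[1] / counts[lab]) for lab, s in sums.items()}

def classify(window, centroids):
    """Pick the activity whose centroid is nearest in feature space."""
    f = features(window)
    return min(centroids, key=lambda lab: (f[0] - centroids[lab][0]) ** 2
                                          + (f[1] - centroids[lab][1]) ** 2)
```

Sitting produces a steady magnitude (low variance) while walking oscillates, so even these two crude features separate them; real systems add frequency-domain features and better classifiers.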
One of the more bizarre/interesting ideas we saw was a phone locator based on resonance (PDF). Designed for a Symbian device, it would play a sound and then record the result, which had been modified by the surroundings. Each surface has its own signature, so you could query the phone and it would report where it was, e.g. on the desk, on the sofa, or in the drawer. This resonance sampling can also be done using the vibration motor.
The final point [Kai] touched on was privacy. If you're wearing a sensor, you're potentially giving away personal data, and he showed an example of how systems could be designed to keep that information in the hands of its owner. The first part was a camera recording the movement of people in a room: it could identify where the faces were, but not who they were. One of the participants wore an accelerometer recording their movements. That user could correlate the camera's anonymous motion data with his own sensor data to pick out his own track in the space, but no one else could see the full picture.
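The correlation step is essentially signal matching: compare your accelerometer's movement magnitude against each anonymous motion track from the camera and claim the one that moves in step with you. A hedged Python sketch using Pearson correlation (the talk doesn't specify the real system's matching method, so this is just one plausible approach):

```python
import math

def pearson(a, b):
    """Pearson correlation coefficient between two equal-length signals."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(va * vb) if va and vb else 0.0

def match_self(accel_magnitude, anonymous_tracks):
    """Find which anonymous camera track moves in step with my accelerometer.

    anonymous_tracks maps an opaque track id to a per-frame motion magnitude.
    """
    return max(anonymous_tracks,
               key=lambda tid: pearson(accel_magnitude, anonymous_tracks[tid]))
```

The privacy property comes from the asymmetry: only the person holding the accelerometer data can do this matching, while everyone else just sees anonymous tracks.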