Pepper

A wearable Augmented Reality computer and its software applications, with EEG and extended sensor inputs.

Applications

Look up people around you on social networks using facial recognition.

Mood-based news, music, and video recommendations using brainwave (EEG) processing.

Mind-controlled games and switches.

Real-time search for objects around you, with information shown on the head-up display.

Use a ring, gesture control, and pupil tracking as inputs.

Record and stream video to web and virtual reality (VR) applications.

Open platform and API for developers to build software and connect to other devices.

Online community for users to interact & collaborate on projects.

Hardware Modules

Microprocessor Running Linux

Runs a Linux operating system (Ubuntu, OpenWRT, etc.) on the microprocessor, supporting development of open-source native Linux applications, kernel modules, and device drivers.
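
As a minimal sketch of the user-space side, assuming a hypothetical character device exposed by a Pepper kernel driver (the /dev path below is illustrative, not a shipped interface):

    # Read raw samples from a (hypothetical) character device exposed by
    # a Pepper kernel driver; the device path is an assumption.
    DEVICE = "/dev/pepper_eeg0"

    with open(DEVICE, "rb") as dev:
        chunk = dev.read(512)  # one block of raw sensor bytes
        print(len(chunk), "bytes read")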

Wi-Fi + Bluetooth

On-board Wi-Fi enables easy and fast web browsing; Bluetooth Low Energy allows pairing other devices, including external motion or touch sensors, with the Pepper board.
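
A minimal sketch of BLE device discovery, assuming the cross-platform bleak Python library (a generic technique, not a Pepper-specific API):

    import asyncio
    from bleak import BleakScanner

    async def discover_sensors():
        # Scan for nearby BLE peripherals, e.g. external motion or touch
        # sensors waiting to be paired with the Pepper board.
        devices = await BleakScanner.discover(timeout=5.0)
        for d in devices:
            print(d.address, d.name)

    asyncio.run(discover_sensors())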

Electroencephalogram (EEG) Sensors

Dry EEG sensors integrated into the headset collect brain waves, drive machine-learning-based brain-controlled applications, and stream the data to a cloud database. More sensors can be plugged in if needed, with an active on-sensor amplifier extension.
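
One way the cloud-streaming path could look, sketched with a stand-in for the headset driver and a placeholder endpoint (both are assumptions, not the real service):

    import random
    import time
    import requests

    CLOUD_URL = "https://example.com/eeg"  # placeholder endpoint

    def read_eeg_sample(channels=8):
        # Stand-in for the headset driver: random microvolt values.
        return [random.gauss(0.0, 10.0) for _ in range(channels)]

    def stream_to_cloud(batch_size=64):
        batch = []
        while True:
            batch.append({"t": time.time(), "uv": read_eeg_sample()})
            if len(batch) >= batch_size:
                # Batch uploads keep the radio duty cycle low.
                requests.post(CLOUD_URL, json=batch, timeout=2.0)
                batch.clear()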

Head-up Display

Using LCD displays and reflective glass surfaces to create a "hologram" display in front of your eyes on the glass, composed of different views for the left and right eye.
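
As a toy illustration of the left/right-view idea, a sketch that fakes binocular parallax by shifting a flat image horizontally (the real optics and rendering are more involved):

    import numpy as np

    def stereo_views(image, disparity_px=8):
        # Toy model of binocular parallax: shift the image left and right
        # to produce per-eye views; nearer virtual objects would get a
        # larger disparity. A real HUD would use a full 3-D projection.
        left = np.roll(image, disparity_px // 2, axis=1)
        right = np.roll(image, -disparity_px // 2, axis=1)
        return left, right

    frame = np.zeros((480, 640), dtype=np.uint8)
    frame[200:280, 300:340] = 255  # a white square as the virtual object
    left_eye, right_eye = stereo_views(frame)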

Infrared Light Sensors + Accelerometers

Using infrared light sensors, accelerometers, machine learning, and computer vision techniques to sense and track gestures, map them to commands, and detect other moving objects.
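
A minimal sketch of one gesture path, assuming a stream of accelerometer readings in g's (the threshold and window are illustrative, not the shipped gesture pipeline):

    from collections import deque

    def detect_shake(samples, threshold_g=1.5, window=10):
        # Flag a "shake" gesture when the acceleration magnitude spikes
        # for at least half of a sliding window of readings.
        recent = deque(maxlen=window)
        for x, y, z in samples:
            magnitude = (x * x + y * y + z * z) ** 0.5
            recent.append(magnitude > threshold_g)
            if sum(recent) >= window // 2:
                return True
        return False

    # Example: a burst of high-magnitude readings triggers the gesture.
    readings = [(0, 0, 1)] * 5 + [(2, 2, 1)] * 10
    print(detect_shake(readings))  # True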

RGB Cameras

Using inward- and outward-facing cameras to record video and provide inputs for facial, pupil, and object detection and for command mapping.
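
For instance, turning one outward-camera frame into face detections with OpenCV's bundled Haar cascade (a generic technique, not necessarily the shipped pipeline):

    import cv2

    # Haar cascade face detector shipped with OpenCV.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    cap = cv2.VideoCapture(0)  # device index 0: first attached camera
    ok, frame = cap.read()
    if ok:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in faces:
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cap.release()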

Software Modules

Native Linux Machine Learning Module

Using EEG, accelerometer, browser, and other data to build a Linux-based machine learning module for recommendations and system management based on users' preferences and habits.
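
A toy sketch of such a preference model with scikit-learn; the feature names and data below are made up for illustration:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Each row describes one content item the user saw:
    # [mean_alpha_power, mean_beta_power, accel_activity, minutes_on_page]
    X = np.array([[0.8, 0.2, 0.1, 12.0],
                  [0.3, 0.7, 0.5, 1.5],
                  [0.7, 0.3, 0.2, 9.0],
                  [0.2, 0.8, 0.6, 0.5]])
    y = np.array([1, 0, 1, 0])  # 1 = liked / kept watching, 0 = skipped

    model = LogisticRegression().fit(X, y)
    candidate = np.array([[0.75, 0.25, 0.15, 0.0]])
    # Probability of "like", usable as a recommendation-ranking score.
    print(model.predict_proba(candidate)[0, 1])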

Computer Vision Based Facial and Object Recognition

Using computer vision techniques for real-time object and facial recognition, pulling up profile information and object wiki entries based on the user's environment and needs.
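
Once a label is recognized, one simple way to fetch a wiki entry is Wikipedia's public REST summary endpoint; mapping detector labels straight to article titles is a simplification (real labels may need disambiguation):

    import requests

    def object_wiki_summary(label):
        # Fetch a short summary for a recognized object label from the
        # public Wikimedia REST API.
        url = f"https://en.wikipedia.org/api/rest_v1/page/summary/{label}"
        resp = requests.get(url, timeout=3.0)
        resp.raise_for_status()
        return resp.json().get("extract", "")

    print(object_wiki_summary("Bicycle"))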

Machine Learning Based Mood Recognition on EEG

Brainwave-based mood recognition, using supervised learning to track whether the user likes or dislikes something, and to track their level of focus and meditation. Classifier outputs are used for real-time computer control.
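
A compact sketch of this pipeline with band-power features and an SVM; random data stands in for labeled EEG epochs, and the sampling rate and bands are illustrative:

    import numpy as np
    from scipy.signal import welch
    from sklearn.svm import SVC

    FS = 256  # sampling rate in Hz; depends on the headset
    BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

    def band_powers(signal):
        # Average power in classic EEG bands via Welch's PSD estimate.
        freqs, psd = welch(signal, fs=FS, nperseg=FS)
        return [psd[(freqs >= lo) & (freqs < hi)].mean()
                for lo, hi in BANDS.values()]

    # Toy training set: one 1-second epoch per row; 1 = "like", 0 = "dislike".
    rng = np.random.default_rng(0)
    epochs = rng.normal(size=(20, FS))
    labels = rng.integers(0, 2, size=20)

    X = np.array([band_powers(e) for e in epochs])
    clf = SVC(kernel="rbf").fit(X, labels)
    print(clf.predict(X[:1]))  # output could drive a real-time control signal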

Video Streaming to Virtual Reality (VR)

Supports streaming and displaying video to virtual reality applications using web-based VR.
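
One common way to get camera frames into a browser is an MJPEG-over-HTTP stream, sketched here with Flask and OpenCV; a WebGL/three.js scene could then texture the stream onto a VR surface:

    import cv2
    from flask import Flask, Response

    app = Flask(__name__)

    def frames():
        # Yield camera frames as an MJPEG multipart stream.
        cap = cv2.VideoCapture(0)
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            _, jpg = cv2.imencode(".jpg", frame)
            yield (b"--frame\r\nContent-Type: image/jpeg\r\n\r\n" +
                   jpg.tobytes() + b"\r\n")

    @app.route("/stream")
    def stream():
        return Response(frames(),
                        mimetype="multipart/x-mixed-replace; boundary=frame")

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=5000)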

Web Interface for User Networks and Online EEG Training

Building a website for users to register, interact online, and set up profiles. Users can run online training sessions and stream EEG data by listening to music, watching videos, and playing in-browser games with their EEG headsets on.

Interface for Connecting to Other IoT Devices

Software interfaces for connecting to and streaming data from other devices and hardware, including Emotiv, OpenBCI, NeuroSky, Muse, Arduino, iPhone, and Android phones.
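
A sketch of what a shared connector interface could look like; the class and method names are illustrative, not a published Pepper API:

    from abc import ABC, abstractmethod
    from typing import Iterator, List

    class DeviceConnector(ABC):
        # One possible common interface for per-device connectors
        # (Emotiv, OpenBCI, NeuroSky, Muse, ...).

        @abstractmethod
        def connect(self) -> None: ...

        @abstractmethod
        def samples(self) -> Iterator[List[float]]:
            """Yield one multi-channel sample at a time."""

    class FakeConnector(DeviceConnector):
        # A stub implementation so the interface can be exercised end to end.
        def connect(self) -> None:
            print("connected")

        def samples(self):
            yield [0.0, 0.1, -0.2]

    dev = FakeConnector()
    dev.connect()
    print(next(dev.samples()))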

Path

  • 2014.9

    Started Hacking on EEG Brain-Computer Interface Using OpenBCI

A group of hackers and hobbyists from the Cognitive Technology Group at Berkeley started learning about the technical side of brain-computer interfaces, developing data preprocessing and feature selection using an OpenBCI v2 headset.

  • 2014.11

    Machine Learning & Cloud-based Software using Multiple EEG Devices

Started building mood classifiers and concentration/meditation calibration. Developed machine learning training models, including Support Vector Machines, K-Nearest Neighbors, Artificial Neural Networks, Recurrent Neural Networks, etc. Built cloud-based connectors to stream EEG data to the cloud, and wrote connectors for more EEG headsets including the Emotiv EPOC, OpenBCI, and NeuroSky.

  • 2014.12

    Developed Capacitive EEG Sensor & Amplifier Circuit

Collaborated with an ECE research lab at the University of Illinois at Urbana-Champaign to build a dry, capacitive EEG sensor that doesn't need conductive gel. Designed and built an amplifier circuit with an on-sensor gain of 100 to amplify the signal and increase the signal-to-noise ratio.

  • 2015.1

    Played with AR/VR & Electronics

Built brain-controlled VR games using the Muse headset, Unity 3D, and the Oculus Rift, exhibited at the Exploratorium in San Francisco from 1.29 to 3.1. Built a web VR application using WebGL and three.js, and wrote pupil & facial recognition. Developed with more hardware, including embedded Linux, FPGA and microcontroller boards, ARM, and ultrasound and infrared light sensors.

  • Today

    Making AR + EEG Device!

GitHub Link

A few code snippets, with more uploads coming soon.

Visit the Resources on GitHub

Contact Us

Email: team@peppers.io, Phone: 330-998-8159, Address: 2020 Kittredge St., Berkeley, CA, 94704