Air Islamic Android Phones
Code name: asio
Owner: Mohamed Hussein Mustafa ~ Software Engineer, Cairo, Egypt
"Islamic Android Phones" is built for publishing Islamic religious content, aiming at wider distribution in Islamic countries and helping the rapid growth of the Islamic religion through new hi-tech.
I am making a two-wheel self-balancing robot that will balance itself on two wheels using different sensors. My central and main controller will be the PandaBoard ES, and the robot will be controlled over a Wi-Fi interface. There will also be some sensors for obstacle detection.
These are the main things I am doing using the PandaBoard ES.
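The balancing itself is typically a closed control loop: the sensor readings are fused into a tilt-angle estimate, and a PID controller turns the tilt error into a motor command. A minimal sketch of such a controller follows; the gains are illustrative placeholders, not values from this project:

```python
# Hedged sketch of a self-balancing robot's control-loop core: a PID
# controller converting tilt-angle error into a motor command.
# Gains are placeholders; a real robot would tune them and fuse
# gyro + accelerometer data to produce the angle estimate.

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        """error: target angle minus measured angle; dt: loop period in seconds."""
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

Each loop iteration the robot would read its tilt, call `update(target - tilt, dt)`, and send the result to the motor driver.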
How do I get Chrome OS onto an SD card so that I can boot it on my PandaBoard? I need help ASAP! Thanks!!!
I am building a wearable computer that will be optimized for use in an educational environment, especially high school. It will serve to augment the student's memory, with a camera and a microphone, as well as the student's knowledge, through encyclopedias and the internet. It will include a head-mounted display, an easy-to-use keyboard, and possibly internet access.
Das U-Boot (Universal Bootloader) is an open-source primary boot loader used in embedded devices. It is available for a number of different computer architectures, including PPC, ARM, MIPS, AVR32, x86, 68k, Nios, and MicroBlaze.
The PandaBoard is a low-power, low-cost single-board computer based on the Texas Instruments OMAP4430 processor. It is a community supported development platform and is the first OMAP4 mobile software development platform.
The PandaBoard supports both DVI (Digital Visual Interface) and HDMI (High-Definition Multimedia Interface).
U-Boot currently does not initialize the display until the kernel is up and running. The project aims to implement a splash screen for U-Boot on the PandaBoard so that the display does not remain blank until the kernel initializes it.
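U-Boot already has standard splash-screen hooks such a project can build on: enable splash support at build time, then point the `splashimage` environment variable at a BMP loaded into RAM. The addresses, partition, and file name below are illustrative assumptions, not this project's actual configuration:

```
/* Build time, in the board configuration header: */
#define CONFIG_SPLASH_SCREEN

# At the U-Boot prompt (or baked into the default environment):
fatload mmc 0:1 0x82000000 splash.bmp   # load the BMP from the SD card's FAT partition
setenv splashimage 0x82000000           # tell U-Boot where the image lives
saveenv
```

With `splashimage` set, U-Boot draws the bitmap as soon as its video driver comes up, which is exactly the gap this project targets on the PandaBoard.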
A small board connects to the two LCD headers on the bottom of the PandaBoard and provides a simple interface to LVDS LCD panels like those found in notebooks. The board supports the EDID/I2C interface for automatic display configuration. There is also support for LCD brightness control, including automatic brightness control using an ambient light sensor.
Schematics and Gerbers are available under the GPL license.
You can also order a bare PCB, a fully assembled board, or a bundle of the board with an LCD with a capacitive touchscreen.
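The EDID side of that interface is a 128-byte block the board reads from the panel over I2C (conventionally at address 0x50); automatic setup then decodes the preferred video mode from it. A sketch of that decoding step, following the VESA EDID 1.3 detailed-timing layout — the sample bytes are synthetic, not read from real hardware:

```python
# Sketch: decode the preferred video mode from an EDID block, as a board
# doing automatic display configuration would after reading the 128-byte
# EDID over I2C. Offsets follow the VESA EDID 1.3 detailed-timing layout;
# the sample bytes below are synthetic, for illustration only.

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def parse_preferred_mode(edid):
    if edid[0:8] != EDID_HEADER:
        raise ValueError("not an EDID block")
    d = edid[54:72]  # first Detailed Timing Descriptor = preferred mode
    pixel_clock_khz = int.from_bytes(d[0:2], "little") * 10
    hactive = d[2] | ((d[4] & 0xF0) << 4)   # low byte + upper 4 bits
    vactive = d[5] | ((d[7] & 0xF0) << 4)
    return hactive, vactive, pixel_clock_khz

# Synthetic EDID advertising a 1280x800 panel at a 71 MHz pixel clock.
sample = bytearray(128)
sample[0:8] = EDID_HEADER
sample[54:62] = bytes([0xBC, 0x1B, 0x00, 0x00, 0x50, 0x20, 0x00, 0x30])
```

For the synthetic block above, `parse_preferred_mode(bytes(sample))` returns `(1280, 800, 71000)`, i.e. a 1280×800 panel with a 71 MHz pixel clock.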
Our project name is viSparsh. The word has been coined by our team; it is a conglomeration of two words, vision (Vis) and touch (Sparsh). We felt the name viSparsh truly depicted the purpose of our project, which is to aid visually impaired people through the sense of touch. We also thought it an apt name for our team, since the same word signifies our aim of touching the lives of many people through our vision.
The aim of the project is to develop a haptic belt using the Xbox 360 Kinect for visually impaired persons, where the initial technical effort is to develop some relatively low-cost technologies and integrate them into a socially useful application. The project would be one of its kind in India: although a plethora of technological products is already present, only a few actually create the impact needed by Indian society. We are aiming to build a state-of-the-art, easy-to-use product customized for the Indian market.
The final product will be not only a navigation device utilizing the Kinect, GPS, and a digital compass, but also an entertainment and utility product.
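The core of such a belt is a mapping from Kinect depth data to vibration: split each depth frame into zones across the belt and drive each zone's motor harder as the nearest obstacle gets closer. A hedged sketch of that mapping — the three-zone layout and the distance thresholds are illustrative assumptions, not the team's actual design:

```python
# Hedged sketch of the belt's core logic: split a depth frame into
# left/centre/right zones and map the nearest obstacle in each zone to a
# vibration intensity in 0..255. Thresholds are illustrative assumptions.

def zone_intensities(depth_mm, near=500, far=3000):
    """depth_mm: 2D list of depth readings in millimetres (0 = no reading)."""
    width = len(depth_mm[0])
    third = width // 3
    zones = [(0, third), (third, 2 * third), (2 * third, width)]
    out = []
    for lo, hi in zones:
        valid = [row[c] for row in depth_mm for c in range(lo, hi) if row[c] > 0]
        nearest = min(valid) if valid else far
        # Closer obstacle -> stronger vibration, clamped to [0, 255].
        ratio = max(0.0, min(1.0, (far - nearest) / (far - near)))
        out.append(int(round(ratio * 255)))
    return out
```

The returned triple would be fed to the belt's three vibration motors; anything at or beyond `far` millimetres produces no vibration at all.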
Gesture Recognition In Android
Gestures are a powerful means of communication among humans. In fact, gesturing is so deeply rooted in our communication that people often continue gesturing when speaking on the telephone. Hand gestures provide a separate, complementary modality to speech for expressing one's ideas. The information associated with hand gestures in a conversation includes degree, discourse structure, and spatial and temporal structure. So, a natural interaction between humans and computing devices can be achieved by using hand gestures for communication between them.
The purpose of this project is to provide a highly sophisticated Human Machine Interface.
The camera of the computing device is opened simultaneously with a photo viewer application. Appropriate gestures are made; these gestures are captured by the camera, the type of gesture is determined by certain algorithms, and the gestures are converted into computer-understandable commands. These commands are mapped to an application where the intended actions are performed.
This new gesture-based approach allows users to interact with computers through hand postures, with the system adapting to different lighting conditions and backgrounds. Its efficiency makes it suitable for real-time applications.
Gesturing can be used by developers as a tool for developing a wide range of applications, and by typical users of smartphones and tablets that run Android. People who are physically handicapped will also find this system very useful.
The application user performs gestures with the hand. A gesture-recognition system uses a video camera to capture images of the hand movement: it captures the live stream and extracts frames from it. The gesture-recognition software tracks the moving hand's features, identifies the motion, and sends it to the Android application. The Android application then issues commands to the currently running application.
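The "captures the live stream and extracts frames" stage can be sketched as frame differencing — comparing consecutive grayscale frames and flagging motion when enough pixels change. Real hand tracking goes further than this; the thresholds here are illustrative:

```python
# Minimal sketch of the motion-detection front end: frame differencing
# between consecutive grayscale frames. Flags motion when the fraction of
# changed pixels exceeds a threshold. Thresholds are illustrative only.

def motion_detected(prev, cur, pixel_thresh=30, count_thresh=0.1):
    """prev/cur: equal-sized 2D lists of 0-255 grayscale values."""
    changed = 0
    total = 0
    for row_p, row_c in zip(prev, cur):
        for p, c in zip(row_p, row_c):
            total += 1
            if abs(c - p) > pixel_thresh:
                changed += 1
    return changed / total >= count_thresh
```

In the pipeline described above, frames where this returns True would be handed on to the feature-tracking step.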
An evaluation kit with OMAP 4430 processor (PandaBoard).
A motion sensing camera.
RAM: 120 MB or more.
Hard disk: minimum 200 MB.
Android SDK 2.0 or later.
Java and XML.
The system is required to perform the following functions.
Switch on the camera and open an application simultaneously.
The camera should be on in video-capture mode and should run in the background, as the intended application should remain on display.
Capture the gestures made by the user of the device with the motion-sensing camera.
Perform the corresponding actions for the appropriate gestures made by the user.
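The last two requirements reduce to a lookup from a recognised gesture to a command for the foreground application, with unknown gestures mapping to nothing. A tiny sketch — the gesture names and commands are invented for illustration:

```python
# Sketch: map recognised gesture labels to commands for the foreground
# application. Gesture names and command strings are made up for
# illustration; a real system would define its own vocabulary.

GESTURE_COMMANDS = {
    "swipe_left":  "NEXT_PHOTO",
    "swipe_right": "PREV_PHOTO",
    "palm_open":   "PAUSE",
}

def dispatch(gesture):
    # Unknown/inappropriate gestures return None and are simply ignored.
    return GESTURE_COMMANDS.get(gesture)
```

Returning None for unrecognised input is what lets the system "neglect the inappropriate gestures" rather than misfire.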
A Dalvik virtual machine optimized for Android devices.
A rich development environment including device emulators, tools for debugging, memory and performance profiling, and a plug-in for the Eclipse IDE.
The system is expected to run on low memory devices also.
The response time should be minimal, i.e., a response action should be performed as soon as the gestures are made.
The system should neglect the inappropriate gestures made by the user.
Availability of the system depends on availability of the device and its service.
The documentation provided with the application is simple and easily understood.
Platform compatibility is limited to Android devices.
The product build is scalable.
Usability by the target user community is given utmost importance.
By PES School of Engineering.
A System-on-Module based on the OMAP4430 and a clone of the PandaBoard, with a size of 40 mm × 40 mm × 2.5 mm.
8 GB eMMC flash
10/100 Ethernet (SPI interface)
picoFlamingo is a portable presentation solution initially developed for the BeagleBoard and picoDLP projector, but it can be executed on any OpenGL ES 2.0 compliant system. Slides can contain text, images, live video streams, and 3D objects that can be animated in 3D space and dynamically updated to produce advanced user interfaces. When used in combination with NetKitty, picoFlamingo can be controlled remotely from any Bluetooth- or network-enabled device. Simple remote-control tools for Symbian S60, OpenMoko, and Android 1.5 are included. A set of small applications for video streams and voice commands is also included.
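Remote control through NetKitty ultimately means writing a text command to a socket. The sketch below sends one newline-terminated command over TCP; the port number and command string are illustrative assumptions, not picoFlamingo's actual protocol:

```python
# Hedged sketch: push one text command to a remote presentation endpoint
# over TCP, as a NetKitty-bridged controller might. Host, port, and the
# command vocabulary are assumptions for illustration only.

import socket

def send_command(host, port, command):
    # NetKitty-style endpoints relay plain text, so one short line suffices.
    with socket.create_connection((host, port), timeout=5) as s:
        s.sendall((command + "\n").encode("ascii"))
```

A phone-side remote would then be little more than `send_command("192.168.0.10", 5000, "next_slide")` wired to a button.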