Welcome to PIC-TALK Documentation
For a full introduction, visit https://pic-talk.org.
The project is not currently under active development. If you would like more information, please contact email@example.com.
This documentation refers to the latest development version of PIC-TALK.
We develop open-source products for visually impaired people, and anyone can get involved in the project as a volunteer. Videos about the project are available.
To understand how the system works, it helps to look at its main components:
- User Interface
- Haptic Display
- Mobile App
The system is built from the components listed above. Each component can operate on its own or together with the others; combining independent parts in this way yields the most efficient result and suggests an ecosystem design that was not available before. That all of these components are open source is an important cornerstone for the system's development and for interaction with the community.
Colors are among the most important features to identify, both in pictures and in daily life. A special glove and wristband are being designed so that colors can be felt: vibration motors built into the glove will make the colors in paintings perceptible to visually impaired individuals. Beyond sensing colors, the glove will support many other functions and can be reconfigured according to its intended purpose. Its operating modes include:
- Color Detection
- Edge Detection
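One way to picture the color-sensing mode is as a mapping from a sampled pixel color to a distinct vibration pattern for the glove's motors. The sketch below illustrates the idea under stated assumptions: the palette, the pulse patterns, and the function names are all hypothetical, not the project's actual mapping.

```python
# Hypothetical sketch of the glove's color-to-vibration mapping.
# The palette and pulse patterns below are illustrative assumptions.

# Reference palette: color name -> representative RGB value
PALETTE = {
    "red":    (255, 0, 0),
    "green":  (0, 255, 0),
    "blue":   (0, 0, 255),
    "yellow": (255, 255, 0),
    "black":  (0, 0, 0),
    "white":  (255, 255, 255),
}

# Each color gets a distinct vibration pattern: alternating
# on/off durations in milliseconds for the motor driver.
VIBRATION_PATTERNS = {
    "red":    [200, 100, 200],
    "green":  [100, 100, 100],
    "blue":   [400],
    "yellow": [100, 50, 100, 50, 100],
    "black":  [50],
    "white":  [600],
}

def nearest_color(rgb):
    """Return the palette entry closest to rgb (squared Euclidean distance)."""
    return min(
        PALETTE,
        key=lambda name: sum((p - c) ** 2 for p, c in zip(PALETTE[name], rgb)),
    )

def pattern_for(rgb):
    """Vibration pattern for the sampled pixel color."""
    return VIBRATION_PATTERNS[nearest_color(rgb)]
```

A camera or color sensor on the wristband would sample a pixel, and `pattern_for` would choose which pulse sequence to send to the motors; using nearest-neighbor matching keeps the feedback stable even when the sampled color is slightly off.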
This product is intended as an alternative to Braille tablets and Taylor frames, which visually impaired individuals use widely but which are not very efficient. The add-on can also be controlled from a PC over Bluetooth, creating a refreshable haptic display in the form of an 8 * 8 matrix, that is, 64 individually addressable points. With it, visually impaired individuals will be able to physically feel a picture presented in a digital environment.
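Rendering a picture on a 64-point display means reducing it to an 8 * 8 grid of on/off pins. The sketch below shows one plausible reduction, block-averaging followed by thresholding; the function name, threshold value, and image representation are assumptions, not the project's actual pipeline.

```python
# Hypothetical sketch: reduce a grayscale image (a list of rows of
# 0-255 values) to the 8x8 binary matrix the haptic display renders.
# Names and parameters are illustrative assumptions.

GRID = 8  # the display is an 8x8 matrix: 64 points

def to_haptic_matrix(pixels, threshold=128):
    """Downsample by block-averaging, then threshold each cell to 0 or 1."""
    h, w = len(pixels), len(pixels[0])
    bh, bw = h // GRID, w // GRID  # block size (assumes h, w divisible by 8)
    matrix = []
    for gy in range(GRID):
        row = []
        for gx in range(GRID):
            # Average the brightness of the image block behind this pin
            block = [
                pixels[y][x]
                for y in range(gy * bh, (gy + 1) * bh)
                for x in range(gx * bw, (gx + 1) * bw)
            ]
            avg = sum(block) / len(block)
            row.append(1 if avg >= threshold else 0)
        matrix.append(row)
    return matrix
```

Each 1 in the result would raise the corresponding pin; the 64-byte matrix is small enough to send over a Bluetooth serial link in a single packet.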
Computer & Mobile Applications
The application interface consists of two main parts. The first is the section for visually impaired users; the second is for sighted volunteers who contribute to the development of the system. The system will first present an image-reading interface that produces an instant interpretation of the picture. In addition to this interpretation, sound files will be attached to the objects detected in the picture, so that the picture is interpreted and made accessible to the visually impaired person. Tapping the screen will play the audio file of the object under the finger, aiming to describe the picture as well as possible. Volunteers will also be able to record picture descriptions in both voice and text format, and visually impaired users will be able to search for these recordings online and listen to the descriptions easily.
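The tap-to-hear behavior described above can be sketched as a lookup from a screen coordinate to the detected object beneath it. In this sketch the detection format, the bounding-box convention, and the audio file layout (`sounds/<label>.mp3`) are all assumptions for illustration, not the app's actual data model.

```python
# Hypothetical sketch: resolve a screen tap to the audio description
# of the detected object under it. The Detection structure and the
# sounds/ file layout are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Detection:
    label: str            # e.g. "tree", as produced by the object detector
    box: tuple            # (x1, y1, x2, y2) bounding box in screen pixels

def audio_for_tap(detections, x, y):
    """Return the audio file for the topmost object containing (x, y), or None."""
    # Later detections are assumed to be drawn on top, so search in reverse
    for det in reversed(detections):
        x1, y1, x2, y2 = det.box
        if x1 <= x <= x2 and y1 <= y <= y2:
            return f"sounds/{det.label}.mp3"
    return None
```

For example, with a "bird" detection nested inside a "tree" detection, a tap inside both boxes would play the bird's audio, while a tap outside every box returns `None` so the app can fall back to the whole-picture description.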