Top 8 Upcoming User Interface Technologies That Are Almost Here
When we talk about the user interface (UI) in computing, we mean the way a computer program or system presents itself to its users through graphics, text, and sound. Most of us are familiar with the typical Windows and Apple operating systems, where we interact with icons on the desktop using a mouse cursor.
The shift from text to graphics was driven by Apple co-founder Steve Jobs, whose iconic Macintosh operating system marked a huge leap forward in 1984. In recent years, we have also witnessed innovative UIs involving touch, voice, and even gestures. Most of them, however, are still in the early stages of development.
Even so, they give us a clue about what the upcoming technology of the UI might look like. The following are eight key directions that the next generation of user interfaces might take:
1. Gesture Interface:
The 2002 science-fiction film "Minority Report" depicted a future in which interaction with computer systems happens mainly through gestures. Wearing a pair of futuristic gloves, lead actor Tom Cruise manipulates images, videos, and data sheets on his computer system by hand.
Ten years ago, a user interface with such seamless detection of spatial movement seemed a bit far-fetched. Today, with the advent of motion-sensing devices such as the Wii Remote in 2006 and the Kinect and PlayStation Move in 2010, the upcoming technology of the user interface may well be moving in this direction.
In gesture recognition, input is given in the form of hand or other body motion to carry out a computational task; the motion is still captured by a device, such as a camera, a touchscreen, or a motion sensor. Adding a z-axis to the existing 2D UI would undoubtedly improve the experience of human-computer interaction. Imagine how many functions could be mapped to our body movements.
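As a toy illustration of that mapping idea, the sketch below dispatches already-recognized gestures to UI commands. The gesture names, commands, and state are invented for illustration; a real motion-sensing SDK would supply the recognition step.

```python
# Hypothetical sketch: dispatch recognized gestures to UI commands.
# Gesture and command names are made up; a real motion-sensing
# device (Kinect, Wii Remote, etc.) would do the recognition.

def make_gesture_dispatcher():
    state = {"zoom": 1.0, "page": 0}

    def swipe_left():
        state["page"] += 1              # next photo / page

    def swipe_right():
        state["page"] = max(0, state["page"] - 1)

    def pinch_out():
        state["zoom"] *= 1.25           # zoom in

    def pinch_in():
        state["zoom"] /= 1.25           # zoom out

    commands = {
        "swipe_left": swipe_left,
        "swipe_right": swipe_right,
        "pinch_out": pinch_out,
        "pinch_in": pinch_in,
    }

    def handle(gesture):
        action = commands.get(gesture)  # unknown gestures are ignored
        if action:
            action()
        return dict(state)

    return handle

handle = make_gesture_dispatcher()
handle("swipe_left")
print(handle("pinch_out"))  # {'zoom': 1.25, 'page': 1}
```

The dispatcher is deliberately just a lookup table: adding a z-axis or new body movements only means adding entries, which is what makes gesture UIs so extensible.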
Then there is the g-speak demo video, a prototype of the computer interface that appears in "Minority Report", designed by John Underkoffler, who was in fact a science consultant for the film. Watch how he browses thousands of photos on a 3D plane through gestures and collaborates with teammates using them. Excited? Underkoffler believes such a UI will be commercially available within the next five years.
2. Brain Interface:
Our brains produce a variety of electrical signals as we think, and each specific thought has its own brainwave pattern. These unique electrical signals can be mapped to specific commands, so that a thought can actually trigger a preset command.
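A minimal sketch of that mapping, assuming a brainwave "pattern" has already been reduced to a small feature vector (real EEG processing is far more involved, and every number and command name below is invented): classify an incoming pattern against calibrated templates by nearest neighbor, then fire the matching command.

```python
# Hypothetical sketch: map brainwave feature vectors to commands.
# Template vectors and command names are invented for illustration;
# real EEG feature extraction is much more complex.

def classify(pattern, templates):
    """Return the command whose calibrated template is closest
    (by squared Euclidean distance) to the observed pattern."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(templates, key=lambda cmd: dist(pattern, templates[cmd]))

# Templates recorded during a (hypothetical) calibration session.
templates = {
    "push_cube":  [0.9, 0.1, 0.2],
    "pull_cube":  [0.1, 0.8, 0.3],
    "lights_off": [0.2, 0.2, 0.9],
}

observed = [0.15, 0.75, 0.35]         # a new reading from the headset
print(classify(observed, templates))  # pull_cube
```

This also hints at why such UIs are still primitive: the templates must first be learned per user in a calibration step, and noisy readings land near the wrong template.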
With the EPOC neuroheadset created by Tan Le, co-founder and president of Emotiv Lifesciences, users wear a futuristic headset that detects the brain waves generated by their thoughts.
Fully Developed UI:
As can be seen, the commands executed by thought are still very primitive (e.g., pulling a cube toward the user), and detection appears to face some difficulties. It seems this UI may take some time to develop fully.
In any case, imagine a (distant) future in which people operate computer systems by thought alone. From the concept of a "smart home" that lets you turn the lights on or off without leaving your bed in the morning, to an ultimate gaming experience that responds to your mood (through brain waves), the potential of such an awesome UI is virtually unlimited.
3. Flexible OLED Display:
If the touchscreen on your smartphone feels rigid and still does not respond adequately to your commands, you may want to try a flexible OLED (Organic Light-Emitting Diode) display. OLEDs are organic semiconductors that can still emit light when rolled or stretched. Stick one on a flexible plastic substrate and you have a brand new, far less rigid smartphone screen.
In addition, these new screens can be twisted, bent, or folded to interact with the internal computing system: bend the phone to zoom in and out, bend one corner to raise the volume, bend the other corner to lower it, twist both sides to scroll through photos, and so on.
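Those bend gestures reduce to the same dispatch pattern as any other input. The sketch below (sensor region names, angle units, and thresholds are all invented, not from any real flexible-display API) interprets a set of bend-sensor readings as UI actions:

```python
# Hypothetical sketch: translate bend-sensor readings from a flexible
# OLED phone into UI actions. Region names and thresholds are invented.

def interpret_bend(sensors, volume=50):
    """sensors: mapping of region -> bend angle in degrees
    (positive = bent toward the user)."""
    actions = []
    whole = sensors.get("whole_body", 0)
    if whole > 20:
        actions.append("zoom_in")       # bend the whole phone: zoom in
    elif whole < -20:
        actions.append("zoom_out")      # bend it the other way: zoom out
    if sensors.get("top_right", 0) > 15:
        volume = min(100, volume + 10)  # bend a corner: volume up
    if sensors.get("top_left", 0) > 15:
        volume = max(0, volume - 10)    # bend the other: volume down
    return actions, volume

actions, vol = interpret_bend({"whole_body": 25, "top_right": 20})
print(actions, vol)  # ['zoom_in'] 60
```

The thresholds matter in practice: they separate a deliberate bend from the incidental flexing of a phone carried in a pocket.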
Such a flexible UI would let us interact with the smartphone naturally, even when our hands are too occupied to use the touchscreen. It could be the answer to screens that are insensitive (or unresponsive) to gloved fingers, or to buttons too small for large fingers. With this UI, all you would need to do to answer a call is squeeze the phone in the palm of your hand.
4. Augmented Reality (AR):
We have already encountered AR in some of our smartphone apps, but these are still mostly at a basic stage of development. AR is set to get a major upgrade through the upcoming Google Project Glass.
As long as the device can interact with the real-world environment in real time, AR can live anywhere beyond a pair of glasses. Picture a transparent device through which useful information is pinned to objects, buildings, and surroundings. For example, when you encounter a sign in a foreign language, you could view it through the glass device and read it easily.
5. Voice User Interface (VUI):
Speech recognition has not seen a revolutionary breakthrough since Chris Schmandt's "Put That There" video demonstration in 1979. The most talked-about VUI today must be Siri, a personal assistant application integrated into Apple's iOS. It uses a natural-language user interface for voice recognition to perform tasks on Apple devices.
You can also think of VUI as a supporting technology for other upcoming user interfaces, such as Google Glass. Glass works like a smartphone, except that you do not hold it or interact with it with your fingers. Instead, it sits in front of your eyes and receives your commands through voice control.
The only thing still missing from VUIs is reliable recognition of what you say. As smartphone functionality keeps expanding and evolving, it is only a matter of time before VUI becomes a primary form of human-computer interaction for computer systems.
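Once the hard part, turning speech into text, is done, a VUI still has to map the transcript to an action. The toy sketch below (not how Siri actually works; intents and keyword sets are invented) matches a transcript to an intent by keyword overlap:

```python
# Toy sketch of a VUI's command step: after speech has been
# transcribed to text elsewhere, match it to an intent by keywords.
# Intent names and keyword sets are invented examples.

INTENTS = {
    "set_alarm":   {"alarm", "wake"},
    "send_text":   {"text", "message"},
    "get_weather": {"weather", "forecast"},
}

def match_intent(transcript):
    """Return the intent sharing the most keywords with the
    transcript, or None when nothing matches."""
    words = set(transcript.lower().split())
    best, overlap = None, 0
    for intent, keywords in INTENTS.items():
        score = len(words & keywords)
        if score > overlap:
            best, overlap = intent, score
    return best

print(match_intent("What is the weather forecast for tomorrow"))
# get_weather
```

Real assistants replace this keyword matching with statistical natural-language understanding, which is exactly where today's reliability problems come from.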
6. Tangible User Interface (TUI):
A TUI is an upcoming technology in which a computer system bridges the physical environment and the digital domain by recognizing real-world objects. With Microsoft PixelSense (formerly known as Surface), an interactive computing surface can recognize and identify objects placed on its screen.
In Microsoft Surface 1.0, light from objects on the screen is reflected to multiple infrared cameras, allowing the system to detect and react to the items placed on it.
In the advanced version of the technology (the Samsung SUR40 with Microsoft PixelSense), the screen itself embeds the sensors, rather than using cameras, to detect what touches it. On this surface you could, for example, create a digital painting with a real brush, based on the input from the brush's bristles.
The system is also programmed to recognize sizes and shapes and to interact with embedded tags: a tagged business card placed on the screen displays the card's information, and a smartphone placed on the surface can trigger the system to seamlessly display the images stored on it.
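At its core, that tag-based behavior is a lookup from tag ID to response. A minimal sketch, with entirely invented tag IDs and behaviors (not PixelSense's actual API):

```python
# Hypothetical sketch of a tabletop TUI's tag dispatch: the surface
# reads an object's embedded tag ID and looks up how to respond.
# Tag IDs and behaviors below are invented for illustration.

TAG_ACTIONS = {
    0x01A4: ("business_card", "show_contact_details"),
    0x02B7: ("smartphone",    "display_photo_gallery"),
    0x03C9: ("paint_brush",   "enter_painting_mode"),
}

def on_object_placed(tag_id):
    """Called by the (hypothetical) surface when a tagged object lands
    on the screen; unknown tags are ignored."""
    kind, action = TAG_ACTIONS.get(tag_id, ("unknown", "ignore"))
    return f"{kind}: {action}"

print(on_object_placed(0x02B7))  # smartphone: display_photo_gallery
```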
7. Wearable Computer:
A wearable computer is an upcoming electronic device that you wear like an accessory or an item of clothing: a pair of gloves, glasses, a watch, or even a suit. The key feature is that it keeps your hands free and does not hinder your daily activities. In other words, it acts as a secondary device that is there whenever you want to access it.
Think of a watch that can work like a smartphone. Sony released an Android SmartWatch earlier this year that pairs with an Android phone via Bluetooth and delivers notifications for new e-mails and tweets. And, as with any smartphone, you can download compatible applications to the Sony SmartWatch for easy access on your wrist.
More wearable UIs can be expected in the near future. Microchips with smart functions are shrinking toward the nano scale, making them increasingly suitable for everyday wear.
8. Sensor Network User Interface (SNUI):
Here is an example of an upcoming fluid UI: the Sensor Network User Interface (SNUI), made up of multiple compact tiles, each with a color LCD screen, a built-in accelerometer, and an IrDA infrared transceiver, which can interact with one another when in close proximity.
Users physically interact with the tiles by tilting, shaking, lifting, and bumping them against other similar tiles. The tiles can serve as a highly interactive learning tool for children, reacting immediately to their actions.
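Each tile has to decide from its accelerometer alone which of those interactions just happened. A toy sketch of that classification (axes, units in g, and thresholds are invented, not from any real tile firmware):

```python
# Hypothetical sketch: classify one tile's accelerometer reading into
# the physical interactions described above. Thresholds are invented.

def classify_motion(ax, ay, az):
    """(ax, ay, az): acceleration in g; a tile at rest reads (0, 0, 1)
    because gravity acts along the z-axis."""
    magnitude = (ax**2 + ay**2 + az**2) ** 0.5
    if magnitude > 2.5:
        return "shake"        # violent motion in any direction
    if az > 1.4:
        return "lift"         # sharp upward acceleration
    if abs(ax) > 0.4 or abs(ay) > 0.4:
        return "tilt"         # gravity component shifted sideways
    return "at_rest"

print(classify_motion(0.5, 0.0, 0.85))  # tilt
print(classify_motion(2.0, 2.0, 1.0))   # shake
```

Detecting a bump against a neighboring tile would additionally use the infrared transceiver, so two tiles can agree that their simultaneous jolts were the same collision.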
The SNUI is also great for simple puzzle games that involve moving and rotating tiles to win. You can even group the tiles to physically sort images according to your preferences. In this sense, it is a more fluid, user-friendly kind of TUI.
The Most Anticipated UIs:
As these user interfaces become more intuitive and natural for their users, we will be treated to an ever more immersive computing experience that will continually test our ability to absorb everything it offers. It may be overwhelming at times, and exciting at others, but it is definitely something to look forward to as the technology advances.