There’s a great technology based around giving people a 6th sense.
Currently it’s basically a vibrating smartwatch or Fitbit-like device that’s connected via Bluetooth.
The idea is that you can FEEL when something is happening.
My thought was to connect it to the Flightradar24 APIs so I’d know how close an aircraft (or especially a helicopter) is, i.e. both the distance and altitude of the nearest flying machine relative to my location. This would be great when flying a drone: I could feel when to quickly stop and return it.
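To make the mapping concrete, here’s a minimal Python sketch. I’m not reproducing actual Flightradar24 API calls (the endpoints and payloads depend on your plan); assume you’ve already pulled the nearest aircraft’s coordinates from it. The 10 km `max_range_km` cutoff is a number I made up.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def vibration_intensity(distance_km, max_range_km=10.0):
    """Map distance to a 0.0-1.0 vibration strength: closer = stronger."""
    if distance_km >= max_range_km:
        return 0.0
    return 1.0 - distance_km / max_range_km

# Example: you at (51.50, -0.12), a helicopter reported at (51.52, -0.10).
d = haversine_km(51.50, -0.12, 51.52, -0.10)
strength = vibration_intensity(d)
```

Altitude could be folded in the same way, e.g. as a second term that discounts aircraft cruising far above drone height.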
There’s another piece of tech that lets someone who’s blind see: the output of a camera (e.g. mounted on their glasses) drives a 10×10 grid of actuators touching a section of their back.
They then learn to SEE using the output from the grid. They can duck around branches, etc..
One option is to have a Pip-Boy-like device on someone’s arm which has a 6×3 grid (so only 18 touch points) on their wrist.
You would have 3 lines, each 3 points long, for your suit or spaceship readings.
E.g. in your suit you’d have a reading for how empty the O2 is, the battery level and the heat.
Thus as you start out those points don’t do much, but as the resources get more and more depleted you feel the points more and more strongly. As they reach dangerous levels the points start to pulse or even get hot.
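A sketch of that escalation in Python. The 0.5 / 0.2 thresholds and the three-stage pressure, then pulse, then heat progression are made-up values, just to show the shape of the mapping:

```python
def sensation_for_level(level):
    """Map a resource level (0.0 = empty, 1.0 = full) to a haptic state.

    Thresholds are illustrative: above half full you only get gentle
    pressure, below that it pulses, and at critical levels it heats up.
    """
    if level > 0.5:
        return {"pressure": 1.0 - level, "pulse": False, "heat": False}
    if level > 0.2:
        return {"pressure": 1.0 - level, "pulse": True, "heat": False}
    return {"pressure": 1.0, "pulse": True, "heat": True}
```

Each of the three suit lines (O2, battery, heat) would run this mapping independently on its own points.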
You’d also have a 3×3 grid which could be used for navigation. The center point is where you are, and as you turn around you’d feel a point move to track where your destination is.
You could cycle through different destinations or nav points: where’s the Rover, where’s the main base, where’s the mining post you’re heading to, etc.. Although I’m guessing that cycling would be done on the arm band (when you’re out on Mars) or on your phone if you’re in a Rover or not in a suit.
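The center-plus-eight layout maps naturally to compass sectors. Here’s a rough Python sketch: given your current heading and the bearing to the destination (both in degrees), it picks which of the 8 outer cells of the 3×3 grid should be active. The clockwise cell ordering is my own assumption:

```python
def nav_cell(heading_deg, bearing_deg):
    """Pick the outer cell of a 3x3 grid that points toward the destination.

    Returns (row, col); (1, 1) is the centre cell, i.e. 'you'.
    Cells are listed clockwise starting from 'straight ahead'.
    """
    cells = [(0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0), (0, 0)]
    relative = (bearing_deg - heading_deg) % 360  # destination relative to facing
    index = round(relative / 45) % 8              # snap to the nearest 45-degree sector
    return cells[index]
```

So if the base is due east of you and you’re facing north, the middle-right cell fires; turn to face east and the cell slides up to top-centre, just as described above.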
You’d also still have access to those readings via a heads up display or whatever else, but you’d be able to physically feel them. They’d be a 6th+ sense.
Something like it could be used for computer gaming: your mana, your health and where the enemies are in something like DOTA / League of Legends.
Or your Metal / Energy production, miners, units under attack, size of the enemy and other metrics in a game like Planetary Annihilation, Supreme Commander, Command And Conquer, etc…
Or, when you are living on the Mars base, how much of your water, food, O2, electricity and heating resources you are using.
You’d feel how much energy the water boiling takes up.
The physical touch devices could have sophisticated tips which can change shape and texture. Round and soft for more vague things. Sharp and pointed for something important and urgent.
Attributes you could change:
- Amount of pressure: soft, medium, hard
- Motion: pulsing, intermittent pulsing, specially timed sequences, wave motion, etc..
- Texture: rough, sharp or smooth
- Temperature: warm or (maybe) cool
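If you wanted to model that attribute space in software, a small Python sketch might look like this. The specific enum values and the idea of encoding temperature as an offset from skin-neutral are my own guesses:

```python
from dataclasses import dataclass
from enum import Enum

class Motion(Enum):
    STEADY = "steady"
    PULSE = "pulse"
    INTERMITTENT = "intermittent"
    WAVE = "wave"

class Texture(Enum):
    SMOOTH = "smooth"   # round and soft: vaguer, lower-priority signals
    ROUGH = "rough"
    SHARP = "sharp"     # pointed: important and urgent

@dataclass
class HapticCue:
    pressure: float     # 0.0 soft .. 1.0 hard
    motion: Motion
    texture: Texture
    temperature: float  # degrees C offset from skin-neutral

# A critical alert: hard, pulsing, sharp and noticeably warm.
urgent_alert = HapticCue(pressure=1.0, motion=Motion.PULSE,
                         texture=Texture.SHARP, temperature=5.0)
```

Each grid point would then just be assigned a `HapticCue` that the hardware renders.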
You can have more than just the grid I suggested.
One issue is that it’ll take a while to get used to what the sensations mean. You have to learn it.
You might also spend some time learning them in one context, and then if you switch, say from playing a computer game to flying a space ship, the sensations will have different meanings and your brain will have to relearn them.
Or simply moving the grid to a new patch on your body would force you to relearn.
I don’t know if this will be like going to the cinema and still feeling parts of that alternate reality for a while after you come out.
Will it only take a few minutes to adjust? Or will it take days, maybe even weeks?
Side note: I came across an interesting point about how the brain will start to reuse sections of the brain that haven’t been used for over an hour and a half or so (which is a primary reason we dream).
One option is to simply put the patch on a different part of your body.
One wrist for gaming, the other for flying. Will that help?
Maybe you don’t switch contexts, you just add new lines of signal? More grid points?
Will this work? No idea.
But to feel your O2 supply when in outer space or diving would be pretty epic.
A completely different thing compared to the Augmented Reality / VR that Meta and Apple are working on.
But maybe they would work well together?