Call it the internet of bodies, call it emotionally intelligent wearable tech. Designers, engineers and artists want to wake the mainstream tech giants up to the realities of asking people to wear technology.
What they don't want is simple to define: tech companies launching devices for the sake of it. The harder part is figuring out what we do want from our relationship with wearables.
Here are a few things that happen when we wear something. We move around. We encounter strangers, social situations, the elements. We project an image to the outside world and we register how other people perceive us.
So how does technology fit into all of that?
Francesca Rosella is creative director of CuteCircuit, the tech-inclined fashion house known for its telepresence Hug Shirts, Twitter dresses and alien spacesuits for Katy Perry. She sees a problem with the current wave of me-too wearable tech.
"People forget about emotion," she said. "When you put something on your body, you are accepting it as part of your identity. You still have to make a choice: 'What am I going to say about myself today?' That is choosing who you are."
The Hug Shirt, launched in 2006 and updated in February 2014, lets users squeeze their own garment to send a hug to a loved one over long distances. Now CuteCircuit is focusing on self-expression: all of its products have built-in Twitter connectivity, so you can display Twitter feeds for particular usernames, keywords or hashtags.
The smart clothing itself ranges from £100 T-shirts to £1,800 French silk jackets and £5,000 dresses. Customers can also select patterns (yes, of LEDs) themed around fashion-forward cities, or design their own via the "iTunes for fashion" database CuteCircuit developed with Accenture.
"In 5-10 years, clothing will be not just about self-expression but also about communication and sharing emotion," said Rosella.
"Which data is important? Which data do you want to broadcast? I can see people willingly adopting air quality sensors if it contributes to healing the environment. But there are lots of sensors that people won't adopt because they just won't see it as their thing."
In 2012, 9,000 people took part in a telepresence installation across four cities - London (at the National Theatre), Paris, Istanbul and Brussels.
Titled Me and My Shadow, the virtual world experience with numerous portals was the work of the NT, artist Joseph Hyde and design collective Body>Data>Space and built on decades of work in this space.
Ghislaine Boddington, founder of Body>Data>Space, has been exploring telepresence in all its forms - robotics, avatars, wearables, motion and gesture tech - since the early 1990s.
"This is about placing the body at the centre of digital interaction," she said. "Our bodies are natural, we move well as we are. Of course there are wearables that work but for instance, Google Glass is not the most social one."
Boddington gives plenty of examples of connected tech that she approves of: Imogen Heap's music-manipulating gloves, the Myo muscle-sensing gesture armband, Microsoft's Kinect and the Leap Motion tracker. All have specific uses, and none requires the wearer to change their physical behaviour to get the tech to work.
"We can't take a classical product and expect it to work," Boddington added. "We need to make connections between people and objects to create a hypersensory self and a kind of hyperliveness.
"It's a new ecosystem, a complex choreography of the essence of liveness, of experience, breathing, physicality, our memory, presence and absence."
Getting touchy feely
Anyone who has worn a VR headset or a pair of AR glasses will know the problem: you can reach out, touch and gesture all you like, but right now all you feel is thin air.
Bristol-based Ultrahaptics is using ultrasound waves to change that. Sriram Subramanian, its co-founder, has been working on this technology since 2013 and says it is now at the stage where Ultrahaptics has an evaluation programme and is in discussions with tech companies.
It uses an ultrasound array of up to 200 speakers to let you "feel without touching" a distinct point in mid-air, similar to the vibration you get when you tap away on your smartphone. It's very accurate, too: where our sense of touch can distinguish points about 2cm apart, Ultrahaptics is precise to 8.5mm.
"We can target each fingertip individually and create different textures," said Subramanian. "We can even levitate objects. A feeling of touch in interfaces is essential if you want to move quickly and efficiently. We can generate up to 50 sensations at one point, you could feel a circle in your palm or a sphere or a solid object. You get an idea of the shape."
With a minimum distance of 4cm and a maximum of 2m, Ultrahaptics could be used to bring our sense of touch to gesture controls in cars and to bring tactile sensations to gaming with Kinect, and eventually, VR headsets like Oculus Rift.
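The trick behind "feeling without touching" is phased-array focusing: each speaker's output is phase-shifted so that every ultrasound wave arrives in step at a single point in mid-air, creating a pressure focus you can feel. The sketch below shows the core delay calculation under illustrative assumptions (the 40kHz frequency is typical for airborne ultrasound transducers, but the array geometry and function names are not Ultrahaptics' actual implementation).

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature
FREQUENCY = 40_000.0    # 40 kHz, a common airborne ultrasound frequency

def phase_delays(speakers, focal_point):
    """For each speaker position (metres), compute the phase offset
    (radians) so all waves arrive at the focal point in phase.

    The speaker farthest from the focus gets zero offset; nearer
    speakers are delayed so every wavefront coincides at the focus.
    """
    wavelength = SPEED_OF_SOUND / FREQUENCY  # ~8.6 mm at 40 kHz
    dists = [math.dist(s, focal_point) for s in speakers]
    max_d = max(dists)
    # Convert each speaker's path-length shortfall into a phase offset.
    return [(2 * math.pi * (max_d - d) / wavelength) % (2 * math.pi)
            for d in dists]

# A small 4-speaker line array, focus 20cm above the array plane.
array = [(x * 0.01, 0.0, 0.0) for x in range(-2, 2)]
delays = phase_delays(array, (0.0, 0.0, 0.2))
```

Notably, one wavelength of 40kHz ultrasound in air works out to roughly 8.6mm, which is in the same ballpark as the 8.5mm precision figure quoted above: the wavelength sets a natural limit on how tightly such an array can focus.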
It's not difficult to think of wearable tech which could benefit from investigating self-expression, gesture and haptic touch.
We might not all be broadcasting our tweets from our handbags in five years' time, but we sure as hell won't be wearing devices which ignore the fact that we each have a real, living body and unique tastes in how we dress ourselves.
Francesca Rosella, Ghislaine Boddington and Sriram Subramanian were speaking at Re.Work's Internet of Things Summit in London.