
Developers: A New Program Is a Shortcut for Building the Metaverse

Updated by Nanok Bie

In Brief

  • A new software program has been made for developers building the metaverse
  • Using the program, developers can skip a few steps
  • Machine learning can take place once the program has been added

Developers building the metaverse might be able to skip a few steps with a new program called EyeSyn.

EyeSyn is a virtual platform that replicates how human eyes track stimuli. Those stimuli can be anything from watching a conversation to perusing an art gallery. Our eyes are the windows to our soul, after all.

In the metaverse, it is vital for developers to know where our eyes fixate the most. The tiny movements of our eyes and how often our pupils dilate provide a surprising amount of information. Human eyes can reveal whether we’re bored or interested and where our concentration is focused. Eye movements can reveal whether we are an expert or a novice at a given task. They can even show that we’re fluent in a specific language.

Where we direct our gaze says a lot about our beliefs, too. It can accidentally reveal interests that we don’t want others to know about, and things that we may not even know about ourselves.

Developers building the metaverse need to know how we use our eyes. Once they do, they can set up machine learning: as data comes in about what people are looking at, they can train their programs to adapt and cater to what users are focusing on.

Developers can not only tailor content to engagement; they can also do things like reduce rendering resolution in our peripheral vision to save computational power.
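As a rough illustration, here is a minimal sketch of that peripheral-resolution trick (commonly called foveated rendering), assuming we already have a gaze point in normalized screen coordinates. The function name, radius, and 4x4 downsampling are illustrative choices, not anything taken from EyeSyn.

```python
# Minimal sketch of gaze-driven "foveated rendering": keep full detail near the
# gaze point and reduce resolution in the periphery.
# Names and thresholds are illustrative assumptions, not EyeSyn APIs.
import numpy as np

def foveate(frame: np.ndarray, gaze_xy: tuple[float, float],
            fovea_radius: float = 0.15) -> np.ndarray:
    """frame: H x W x 3 image, gaze_xy: normalized (x, y) in [0, 1]."""
    h, w = frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # Distance of every pixel from the gaze point, in normalized units.
    dist = np.hypot(xs / w - gaze_xy[0], ys / h - gaze_xy[1])

    # Periphery: replace pixels with 4x4 blocks to mimic a lower-resolution render.
    coarse = frame[::4, ::4].repeat(4, axis=0).repeat(4, axis=1)[:h, :w]

    out = frame.copy()
    periphery = dist > fovea_radius
    out[periphery] = coarse[periphery]
    return out

# Usage: render at full quality only where the user is actually looking.
frame = np.random.randint(0, 255, (480, 640, 3), dtype=np.uint8)
cheap_frame = foveate(frame, gaze_xy=(0.5, 0.4))
```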

All of this takes time and research. But what if a company has already studied what humans look at? And what if that company wrote the software, ready for machine learning, so that all developers had to do was drop the program into their metaverse? Their metaverse could start machine learning right away.

Developers and EyeSyn

Enter computer engineers at Duke University. They have developed a set of ‘virtual eyes’ that simulate normal eye movement well enough to train new metaverse applications. Called EyeSyn, the program will give developers a shortcut when building the rapidly expanding metaverse.

Maria Gorlatova is a Professor of Electrical and Computer Engineering at Duke. “If you’re interested in detecting whether a person is reading a comic book or advanced literature by looking at their eyes alone, you can do that. But training that kind of algorithm requires data from hundreds of people wearing headsets for hours at a time. We wanted to [allow] smaller companies who don’t have those levels of resources to get into the metaverse game.”


Mimicking humans

When one human watches another human talk, their eyes alternate between the speaker’s eyes, nose, and mouth for varying amounts of time. The EyeSyn researchers created a model that extracts where those features are on a speaker’s face, then programmed virtual eyes to statistically mimic the time spent focusing on each region.
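A toy sketch of that statistical idea might look like the following: pick a facial region to fixate on and draw a dwell time from per-region statistics. The region coordinates and dwell parameters are invented placeholders, not values from the EyeSyn model.

```python
# Toy sketch of the idea above: virtual eyes pick a facial region to fixate on
# and a dwell time for it, drawn from per-region statistics.
# Region positions and dwell parameters are invented placeholders.
import random

FACE_REGIONS = {
    # region: ((x, y) centre in normalized image coords, mean dwell in seconds)
    "left_eye":  ((0.40, 0.35), 0.60),
    "right_eye": ((0.60, 0.35), 0.60),
    "nose":      ((0.50, 0.50), 0.25),
    "mouth":     ((0.50, 0.68), 0.45),
}

def simulate_gaze(duration_s: float, jitter: float = 0.02):
    """Yield (x, y, dwell) fixations until `duration_s` is covered."""
    t = 0.0
    regions = list(FACE_REGIONS)
    while t < duration_s:
        region = random.choice(regions)
        (cx, cy), mean_dwell = FACE_REGIONS[region]
        dwell = random.expovariate(1.0 / mean_dwell)   # random dwell time
        x = cx + random.gauss(0.0, jitter)             # small fixation jitter
        y = cy + random.gauss(0.0, jitter)
        yield x, y, dwell
        t += dwell

# Usage: print a few seconds of simulated fixations.
for x, y, dwell in simulate_gaze(duration_s=5.0):
    print(f"fixate ({x:.2f}, {y:.2f}) for {dwell:.2f}s")
```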

Gorlatova said, “If you give EyeSyn a lot of different inputs and run it enough times, you’ll create a data set of synthetic eye movements that is large enough to train a (machine learning) classifier for a new program.”
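In code, that pipeline might look roughly like the sketch below: generate many synthetic gaze traces whose statistics depend on the activity, flatten each trace into simple features, and train an off-the-shelf classifier on purely synthetic data. The two "activities" and their gaze statistics are made up for illustration; only the overall workflow follows what Gorlatova describes.

```python
# Sketch of the pipeline in the quote: build a large synthetic gaze dataset,
# then train a machine-learning classifier on it. Activities and their
# statistics are invented for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def synthetic_trace(activity: str, n_fixations: int = 100) -> np.ndarray:
    """Return per-fixation (x, y, dwell) rows with activity-dependent statistics."""
    if activity == "reading":
        # Reading: fixations sweep left to right with short dwells.
        x = np.tile(np.linspace(0.1, 0.9, 20), 5)[:n_fixations]
        y = rng.normal(0.5, 0.02, n_fixations)
        dwell = rng.exponential(0.25, n_fixations)
    else:
        # Conversation: fixations cluster around eyes/mouth with longer dwells.
        x = rng.normal(0.5, 0.1, n_fixations)
        y = rng.choice([0.35, 0.68], n_fixations) + rng.normal(0, 0.03, n_fixations)
        dwell = rng.exponential(0.5, n_fixations)
    return np.column_stack([x, y, dwell])

def features(trace: np.ndarray) -> np.ndarray:
    """Summarize a trace: means and stds of position and dwell time."""
    return np.concatenate([trace.mean(axis=0), trace.std(axis=0)])

# Run the simulator many times to build a dataset large enough to train on.
X = np.array([features(synthetic_trace(a)) for a in ["reading", "conversation"] * 500])
y = np.array([0, 1] * 500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("held-out accuracy on synthetic data:", clf.score(X_test, y_test))
```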

To test the accuracy of their synthetic eyes, the researchers compared a dataset of the virtual eyes looking at art with actual datasets collected from people browsing a virtual art museum. The results showed that EyeSyn closely matched the distinct patterns of real gaze signals and could simulate the different ways different people’s eyes react.
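One simple way to picture that kind of comparison: bin two sets of fixation points into gaze heatmaps and measure how strongly they correlate. The data below is random stand-in data, not the researchers' museum datasets.

```python
# Sketch of the kind of comparison described above: bin fixations into gaze
# heatmaps and correlate them. The points here are random stand-ins.
import numpy as np

rng = np.random.default_rng(42)
synthetic_gaze = rng.normal([0.5, 0.40], 0.10, size=(5000, 2))  # simulated fixations
real_gaze = rng.normal([0.5, 0.42], 0.12, size=(5000, 2))       # stand-in for recordings

def heatmap(points: np.ndarray, bins: int = 32) -> np.ndarray:
    h, _, _ = np.histogram2d(points[:, 0], points[:, 1],
                             bins=bins, range=[[0, 1], [0, 1]])
    return h / h.sum()

# Pearson correlation between the flattened heatmaps: closer to 1.0 means the
# synthetic gaze distribution looks more like the real one.
similarity = np.corrcoef(heatmap(synthetic_gaze).ravel(),
                         heatmap(real_gaze).ravel())[0, 1]
print(f"gaze heatmap correlation: {similarity:.2f}")
```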

Gorlatova says this level of performance is good enough for companies to use as a baseline to train new metaverse software. From that basic level of competency, commercial software can get even better results by personalizing its algorithms as it interacts with users.

“The synthetic data alone isn’t perfect, but it’s a good starting point. Smaller companies can use it rather than spending time and money trying to build their own real-world datasets with human subjects.”
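That "starting point" workflow, where a model is pretrained on plentiful synthetic data and then personalized with a little real user data, could be sketched roughly as follows. The feature matrices are random placeholders standing in for simulator output and headset logs.

```python
# Sketch of the "starting point" idea: fit on abundant synthetic gaze features,
# then personalize with a small batch of real user data.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(1)

# Placeholders: a large synthetic feature matrix and a small real, user-specific one.
# In practice the synthetic features would come from a simulator like EyeSyn and
# the real ones from headset logs.
synthetic_X, synthetic_y = rng.normal(size=(2000, 6)), rng.integers(0, 2, 2000)
real_X, real_y = rng.normal(size=(20, 6)), rng.integers(0, 2, 20)

model = SGDClassifier(loss="log_loss", random_state=0)

# 1) Baseline: fit on plentiful synthetic data (declare classes so that later
#    incremental updates can reuse the same model).
model.partial_fit(synthetic_X, synthetic_y, classes=np.array([0, 1]))

# 2) Personalize: incrementally update with the few real examples collected
#    while the user interacts with the application.
model.partial_fit(real_X, real_y)
```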

Developers, it’s a brave new world out there. We will watch this space.



Nicole Buckler
Nicole Buckler has been working as an editor and journalist for over 25 years, writing from Sydney, Melbourne, Taipei, London, and Dublin. She now writes from the Gold Coast in Australia. Nicole first bought Bitcoin in 2013 not really understanding what she was doing. She is an accidental HoDLer. Got a story on the culture side of crypto? Email [email protected]