dopetalk does not endorse any advertised product nor does it accept any liability for its use or misuse


Author Topic: A computer that understands how you feel  (Read 4735 times)

Offline Chip (OP)

  • Server Admin
  • Administrator
  • Hero Member
  • Join Date: Dec 2014
  • Location: Australia
  • Posts: 6648
  • Gender: Male
  • Deeply Confused Learner
  • Profession: IT Engineer
A computer that understands how you feel
« on: July 27, 2019, 12:54:16 PM »
source: https://neurosciencenews.com/computer-emotion-understanding-14581/

Whilst looking into Identifying Emotions on the Basis of Neural Activation, I found this email, dated July 26, 2019:

A computer that understands how you feel

EmoNet, a new convolutional neural network, can accurately decode images into eleven distinct emotional categories. Training the AI on over 25,000 images, researchers demonstrate that image content alone is sufficient to predict the category and valence of human emotions.

Could a computer, at a glance, tell the difference between a joyful image and a depressing one?

EDIT: I think that my kind of computer should be able to tell me funny jokes and make me smile too. Give me mo' o' that Artificial Silliness material, please ...

Could it distinguish, in a few milliseconds, a romantic comedy from a horror film?

Yes, and so can your brain, according to research published this week by University of Colorado Boulder neuroscientists.

“Machine learning technology is getting really good at recognizing the content of images – of deciphering what kind of object it is,” said senior author Tor Wager, who worked on the study while a professor of psychology and neuroscience at CU Boulder. “We wanted to ask: Could it do the same with emotions? The answer is yes.”

Part machine-learning innovation, part human brain-imaging study, the paper, published Wednesday in the journal Science Advances, marks an important step forward in the application of “neural networks” – computer systems modeled after the human brain – to the study of emotion.

It also sheds a new, different light on how and where images are represented in the human brain, suggesting that what we see – even briefly – could have a greater, more swift impact on our emotions than we might assume.

“A lot of people assume that humans evaluate their environment in a certain way and emotions follow from specific, ancestrally older brain systems like the limbic system,” said lead author Philip Kragel, a postdoctoral research associate at the Institute of Cognitive Science. “We found that the visual cortex itself also plays an important role in the processing and perception of emotion.”

THE BIRTH OF EMONET

For the study, Kragel started with an existing neural network, called AlexNet, which enables computers to recognize objects. Using prior research that identified stereotypical emotional responses to images, he retooled the network to predict how a person would feel when they see a certain image.
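
For the technically curious, that "retooling" amounts to transfer learning: reuse AlexNet's pretrained visual features and train a new output layer for emotions. Below is a minimal PyTorch sketch of the idea, not the authors' actual code; freezing the feature layers and the exact training setup are my assumptions:

[code]
import torch.nn as nn
from torchvision import models

# Start from AlexNet pretrained on ImageNet object recognition.
model = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)

# Keep the learned visual features fixed (an assumption; full fine-tuning is also common).
for param in model.features.parameters():
    param.requires_grad = False

# Swap the 1000-way object classifier for a 20-way emotion classifier,
# matching the 20 emotion categories described in the article.
model.classifier[6] = nn.Linear(model.classifier[6].in_features, 20)
[/code]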

He then “showed” the new network, dubbed EmoNet, 25,000 images ranging from erotic photos to nature scenes and asked it to categorize them into 20 categories such as craving, sexual desire, horror, awe, and surprise.

EmoNet could accurately and consistently categorize 11 of the emotion types. But it was better at recognizing some than others. For instance, it identified photos that evoke craving or sexual desire with more than 95 percent accuracy. But it had a harder time with more nuanced emotions like confusion, awe, and surprise.
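
As an aside, per-category accuracy of the kind quoted above can be tallied in a few lines of NumPy. This is an illustrative sketch only (the variable names are hypothetical and the study's actual evaluation protocol is more involved):

[code]
import numpy as np

def per_class_accuracy(y_true, y_pred, n_classes=20):
    """Fraction of images in each emotion category that were labelled correctly."""
    acc = np.full(n_classes, np.nan)
    for c in range(n_classes):
        mask = (y_true == c)
        if mask.any():
            acc[c] = (y_pred[mask] == c).mean()
    return acc

# e.g. acc = per_class_accuracy(true_labels, emonet_predictions)  # hypothetical arrays
[/code]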

Even a simple color elicited a prediction of an emotion: When EmoNet saw a black screen, it registered anxiety. Red conjured craving. Puppies evoked amusement. If there were two of them, it picked romance. EmoNet was also able to reliably rate the intensity of images, identifying not only the emotion it might elicit but how strong it might be.

When the researchers showed EmoNet brief movie clips and asked it to categorize them as romantic comedies, action films or horror movies, it got it right three-quarters of the time.
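
The article doesn't say how EmoNet scored whole clips, but one common approach is to average the network's per-frame predictions across the clip; here is a hypothetical sketch along those lines:

[code]
import torch

@torch.no_grad()
def classify_clip(model, frames):
    """frames: preprocessed tensor of shape (n_frames, 3, 224, 224).
    Averages per-frame class probabilities over the clip, returns the top class."""
    probs = torch.softmax(model(frames), dim=1)  # (n_frames, n_classes)
    return probs.mean(dim=0).argmax().item()
[/code]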

WHAT YOU SEE IS HOW YOU FEEL

To further test and refine EmoNet, the researchers then brought in 18 human subjects.

As a functional magnetic resonance imaging (fMRI) machine measured their brain activity, they were shown 4-second flashes of 112 images. EmoNet saw the same pictures, essentially serving as the 19th subject.

When activity in the neural network was compared to that in the subjects’ brains, the patterns matched up.

“We found a correspondence between patterns of brain activity in the occipital lobe and units in EmoNet that code for specific emotions. This means that EmoNet learned to represent emotions in a way that is biologically plausible, even though we did not explicitly train it to do so,” said Kragel.
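
A toy version of that comparison (the paper's model-based fMRI analysis is more elaborate) is to correlate each EmoNet emotion unit's response profile across the images with each occipital voxel's response profile for the same images:

[code]
import numpy as np

def unit_voxel_correlations(unit_acts, voxel_acts):
    """unit_acts:  (n_images, n_units)  EmoNet emotion-unit activations.
       voxel_acts: (n_images, n_voxels) occipital fMRI responses.
       Returns an (n_units, n_voxels) matrix of Pearson correlations."""
    u = (unit_acts - unit_acts.mean(0)) / unit_acts.std(0)
    v = (voxel_acts - voxel_acts.mean(0)) / voxel_acts.std(0)
    return u.T @ v / len(u)  # mean product of z-scores = Pearson r
[/code]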

The brain imaging itself also yielded some surprising findings. Even a brief, basic image – an object or a face – could ignite emotion-related activity in the visual cortex of the brain. And different kinds of emotions lit up different regions.

“This shows that emotions are not just add-ons that happen later in different areas of the brain,” said Wager, now a professor at Dartmouth College.

“Our brains are recognizing them, categorizing them and responding to them very early on.”

Ultimately, the researchers say, neural networks like EmoNet could be used in technologies to help people digitally screen out negative images or find positive ones.

It could also be applied to improve computer-human interactions and help advance emotion research.

The takeaway, for now, says Kragel: “What you see and what your surroundings are can make a big difference in your emotional life.”

EDIT: Don't forget what you feel or hear either, like doubt, rumour, innuendo and music!


