dopetalk does not endorse any advertised product nor does it accept any liability for its use or misuse

This website has run out of funding, so feel free to contribute if you can afford it (see footer)

Author Topic: Facebook says it can read thoughts with mind-reading device (and more on MR)  (Read 4590 times)

Offline Chip (OP)

  • Server Admin
  • Moderator
  • Hero Member
  • Administrator
  • Join Date: Dec 2014
  • Location: Australia
  • Posts: 6648
  • Reputation Power: 0
  • Chip has hidden their reputation power
  • Gender: Male
  • Last Login: Today at 05:53:23 PM
  • Deeply Confused Learner
  • Profession: IT Engineer
source: https://www.independent.co.uk/life-style/gadgets-and-tech/news/facebook-mind-reading-brain-machine-interface-neuralink-a9028801.html

I wonder what's really going on in the undocumented and covert versions of this? I'm sure this technology is far more advanced than what has been made public.

FACEBOOK SAYS IT CAN READ THOUGHTS WITH MIND-READING DEVICE

31 July 2019



Facebook is working on a headset that can transfer a person’s thoughts directly onto a computer screen using a brain-machine interface.

A paper describing the technology, published in Nature Communications, reveals how the headset decodes brain activity to instantly transcribe what a person is saying into text on a computer screen.

The algorithm that decodes the activity is currently only able to recognise a small set of words and phrases but the technology giant said that its non-invasive, wearable device could one day allow people with paralysis to communicate.

The technology could also transform the utility of augmented reality glasses and virtual reality headsets through thought-based controls.

“Being able to recognise even a handful of imagined commands, like ‘home’, ‘select’, and ‘delete’ would provide entirely new ways of interacting with today’s VR systems – and tomorrow’s AR glasses,” Facebook wrote in a blog post describing the brain-computer interface device.

“Imagine a world where all the knowledge, fun, and utility of today’s smartphones were instantly accessible and completely hands-free... A decade from now, the ability to type directly from our brains may be accepted as a given. Not long ago, it sounded like science fiction. Now, it feels within plausible reach.”

Facebook first revealed its ambitions to read people’s minds in 2017 at its annual F8 conference, when Regina Dugan took to the stage and asked the question: “What if you could type directly from your brain?”

Since then, the company has been building a headset to make this a reality through its Facebook Reality Labs, as well as through collaborations with some of the world’s leading universities.

It is not the only firm working on brain-machine interfaces with the hope of one day commercialising the technology.

Earlier this month, Elon Musk-founded startup Neuralink revealed its own device that can connect human brains directly to computers.

The key difference between Neuralink’s “threads” and Facebook’s headset is that Facebook’s device is non-invasive and does not require any form of surgery.

It works instead by measuring brain activity from receivers placed around a person’s head.

Given Facebook’s track record with its users’ privacy, the researchers pointed out that it was important to take safety and security into consideration when developing the device.

“We can’t anticipate or solve all of the ethical issues associated with this technology on our own. What we can do is recognise when the technology has advanced beyond what people know is possible, and make sure that information is delivered back to the community,” said Mark Chevillet, director of the brain-computer interface program at Facebook Reality Labs.

“Neuroethical design is one of our programme’s key pillars – we want to be transparent about what we’re working on so that people can tell us their concerns about this technology.”



source: https://fortune.com/2019/05/07/artificial-intelligence-mind-reading-technology/

A.I. Can Now Read Your Thoughts—And Turn Them Into Words and Images



Artificial intelligence played an essential role in helping a paraplegic patient walk by interpreting the electrical signals emitted by the body. Now that mind-reading technology is also possible, A.I. brain implants will likely be next.


A recent article in Nature highlights a discovery that pushes the boundaries of our imaginations and challenges some of the very attributes that make us human.

The piece details how artificial intelligence is creating speech by interpreting brain signals (and even offers an audio recording for a chance to hear it for yourself). It’s a key advancement for people who can’t speak because it provides a direct technologically-enabled path from thought to speech.

The implications of this discovery go beyond the recreation of speech: A.I. was used to decode brain waves and then reassemble neural impulses. While the focus of this study was on the mechanistic components of speech, such as direct muscle movement, it still acquired information from the early stages of thought development to construct words that were identifiable about 70% of the time. In other words, A.I. actually translated the code that makes up pre-speech.

A.I. has also enabled the recreation of another sense through the reading of neural output: vision. In a recent study, for instance, functional magnetic resonance imaging (fMRI) data was combined with machine learning to visualize perceptual content from the brain. Image reconstruction from this brain activity—which was translated by A.I.—recreated images on a computer screen that even the casual observer could recognize as the original visual stimuli.

But here’s where it gets really interesting: These advancements create the potential for a new level of direct communication mediated not by humans but by A.I. and technology.

Steps are currently being taken to transition such technology from research to real-life application. The utility of an electronic mesh—a microscopic network of flexible circuits that are placed into the brain and insulated with actual nerve cells—is being tested in animals now.

Even Elon Musk has joined the endeavour of transmitting impulses directly to and from the human brain. His company Neuralink is currently developing an interface between computer and brain using neural lace technology – a cellular infrastructure that allows microelectrodes to be incorporated into the structure of the brain itself.

What lies ahead is a further blurring of the distinction between man and machine: A.I. may soon find a new home as less of an external device and more of a neuromechanical biological system that lives within our bodies.

The codification of speech and vision into pre-sensory data and the potential creation of miniature, biologically-compatible interfaces will drive a new vista for biology and technology where the sum of the parts—human and electronic—combine to transcend the limitations of the cell and the electron.



source: http://www.newworldwar.org/mindreading.htm

Mind-Reading

History

Probably no more “intrusive and persistent” method of obtaining information about a person exists than reading their mind.[1] Research on mind-reading has been vigorously pursued by US government agencies and various academic centers since the 1970s, and continues to this day.

Since 1973 DARPA has been studying mind-reading with EEG hooked to computers, working with scientists at the University of Illinois, UCLA, Stanford Research Institute, the Massachusetts Institute of Technology, and the University of Rochester.

They developed a system that could determine how a person perceived colors or shapes and were working on methods to detect daydreaming, fatigue, and other brain states. Although the device had to be calibrated for each person’s brain by having them think a series of specific thoughts, the calibration was quick.

In 1974 another very basic mind-reading machine was created by researchers at Stanford Research Institute. It used an EEG hooked to a computer, which allowed a dot to be moved across a computer screen using thought alone. When interpreting people’s brainwaves, it was right about 60% of the time. During these tests scientists discovered that brain patterns are like fingerprints: each person has their own. So each computer would have to be calibrated for a specific person.
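
To make that mechanism concrete, here is a toy Python sketch of thought-driven cursor control in the spirit of that demo. The choice of feature (alpha-band power) and the threshold are invented for illustration; the actual SRI signal processing is not documented here.

```python
# A toy sketch of EEG cursor control: map one brain-signal feature
# (alpha-band power, 8-12 Hz) to dot movement. Feature choice, sampling
# rate, and threshold are illustrative assumptions.
import numpy as np

FS = 256  # assumed sampling rate in Hz

def alpha_power(eeg_window, fs=FS):
    """Crude alpha-band (8-12 Hz) power from one window of EEG samples."""
    spectrum = np.abs(np.fft.rfft(eeg_window)) ** 2
    freqs = np.fft.rfftfreq(len(eeg_window), d=1 / fs)
    return spectrum[(freqs >= 8) & (freqs <= 12)].sum()

def step(dot_x, eeg_window, threshold=1000.0):
    """Move the dot right when alpha power is high, left otherwise."""
    return dot_x + (1 if alpha_power(eeg_window) > threshold else -1)

# Demo: strong synthetic 10 Hz activity should push the dot rightward.
t = np.arange(FS) / FS
window = 5.0 * np.sin(2 * np.pi * 10 * t)
print(step(0, window))  # expected: 1
```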

Another method to address this issue was to store a large number of generic patterns on the computer, so when it encountered a brain pattern it didn’t recognize, it used the one that most resembled it. Since then, DARPA has sponsored Brain-Computer Interface (BCI) and mind-reading programs at Duke University, MIT, the University of Florida, and New York State University, Brooklyn.

The Human Computer Interaction group at Tufts University has studied mind-reading funded by grants from a government research and education agency known as the National Science Foundation (NSF). Carnegie Mellon University, Stanford University, and the MIT Sloan School of Management have studied mind-reading. The Computer Laboratory at the University of Cambridge in England has developed mind-reading machines based on facial expressions.

Other academic institutions that have participated in mind-reading projects include the University of California, Berkeley, University of Maryland, and Princeton University in New Jersey. Microsoft has studied mind-reading using EEG to better accommodate its users. Emotiv Systems built a mind-reading gaming device which uses EEG to infer the mental states of video game players. Honda Motors and Advanced Telecommunications Research Institute International (ATR) have studied mind-reading.

Neuroimaging Devices

Scientists discovered that the neural code of the human brain is similar to the digital code of a computer. To some extent, they have deciphered this code. Prior to this, they assumed that it was necessary to identify the neurons associated with specific acts, which would have made mind-reading much more difficult.

They now understand that it’s not necessary to monitor billions of neurons to determine which are connected to a particular thought or act. Only a small number of them need to be monitored to accomplish this.

To monitor these neurons, researchers use neuroimaging devices. These include event-related optical signal (EROS), functional magnetic resonance imaging (fMRI), electroencephalography (EEG), functional near-infrared imaging (fNIR), magnetoencephalography (MEG), and positron emission tomography (PET). These devices may be combined for a more accurate reading.

There are basically two types of measurements, direct methods and indirect methods. Direct methods measure changes in electromagnetic fields and currents around the brain which are emitted from the surface of the scalp, or they monitor the neurons themselves. Indirect methods measure hemodynamic (blood movement) changes of hemoglobin in specific tissue compartments.

Both of these methods register changes nearly simultaneously with neuronal activity. Regarding sensors, there are invasive ones, which must be implanted, and non-invasive ones, which can be worn on the scalp in the form of a headband.

Electroencephalography (EEG) provides a direct method for determining brain states and processes by measuring the electrical activity on the scalp produced by the firing of neurons in the brain. EEG has been around for over 100 years and is commonly used in neuroscience, cognitive science, and cognitive psychology. It is inexpensive, silent, non-invasive, and portable, and it tolerates movement.

Wireless EEG, which uses non-invasive sensors in physical contact with the scalp, can transmit the signals to a remote machine for deciphering. Notably, in 1976 the Los Angeles Times reported that DARPA was working on an EEG that could detect brain activity several feet from a person’s head, to be completed in the 1980s. EEG normally produces only a general indicator of brain activity.

However, in 2008 Discovery News reported that a company called Emotiv Systems developed an algorithm that decodes the cortex, providing a more accurate measurement. “We can calibrate the algorithm across a wide range of technologies with the same resolution you would get from placing an invasive chip inside the head,” said Tan Le, president of Emotiv Systems.
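
For a sense of how raw EEG is typically turned into machine-readable numbers, here is a minimal Python sketch of band-power feature extraction using Welch's method. This is a generic textbook technique, not Emotiv's proprietary algorithm; the sampling rate and band edges are assumptions.

```python
# A generic sketch of EEG feature extraction: band power via Welch's method.
# Sampling rate and band edges are illustrative assumptions, not details of
# any specific device mentioned above.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate in Hz

def band_power(signal, fs, low, high):
    """Total spectral power of `signal` between `low` and `high` Hz."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs)
    mask = (freqs >= low) & (freqs <= high)
    return np.trapz(psd[mask], freqs[mask])

# Two seconds of synthetic single-channel EEG: 10 Hz alpha rhythm plus noise.
t = np.arange(0, 2, 1 / FS)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(len(t))

for name, lo_f, hi_f in [("delta", 1, 4), ("alpha", 8, 12), ("beta", 13, 30)]:
    print(name, band_power(eeg, FS, lo_f, hi_f))
# The alpha band should dominate for this synthetic signal.
```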

Functional magnetic resonance imaging (fMRI) measures the blood flow in the brain in response to neural activity. Active neurons use oxygen, which is brought to them by blood. The more active a region of the brain is, the more blood flows in the area. This movement of blood is referred to as hemodynamic activity. fMRI can detect which areas are receiving blood, which indicates that they’re processing information.

fMRI provides an indirect measurement of brain processes. It is the most common method of neuroimaging, and can produce two- and three-dimensional images. It is non-invasive, and can record signals from all brain regions, unlike EEG, which focuses on the surface only.
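
The "indirect" character of fMRI can be made concrete in a few lines: the measured BOLD signal is conventionally modeled as neural events convolved with a slow hemodynamic response function (HRF). The double-gamma HRF below is the standard textbook form; the timings are illustrative, not taken from any study cited here.

```python
# A textbook sketch of why fMRI is an indirect measure: the BOLD signal is
# modeled as neural events convolved with a slow hemodynamic response
# function (HRF). Timings and the double-gamma shape are illustrative.
import numpy as np
from scipy.stats import gamma

TR = 1.0                    # assumed sampling interval in seconds
t = np.arange(0, 30, TR)    # the HRF unfolds over roughly 30 seconds
hrf = gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6.0
hrf /= hrf.sum()

neural = np.zeros(100)      # a spike train of "neural events"
neural[[10, 40, 70]] = 1.0  # stimuli at t = 10 s, 40 s, 70 s

bold = np.convolve(neural, hrf)[: len(neural)]  # what the scanner records
print(bold.argmax())  # the peak lags the first event by several seconds
```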

Functional near-infrared imaging (fNIR) provides an indirect measurement of brain activity by detecting hemodynamic changes in the cortex. Although it is based on different principles, in that it uses light, it functions in the same manner as fMRI. FNIR can provide an almost continuous display of these changes in the cortex. It is inexpensive, non-invasive, and portable. A wireless headband with sensors exists for this device.

Event-related optical signal (EROS) is a brain-scanning technique that shines near-infrared light into the cerebral cortex and detects neuronal activity through changes in the optical properties of brain tissue. Because it can only detect these changes a few centimeters deep, it can only image the cerebral cortex. Unlike fNIR, which is an optical method for measuring blood flow, EROS detects the activity of the neurons themselves and so provides a direct measurement of brain activity. It is very accurate, portable, inexpensive, and non-invasive. A wireless headband with sensors exists for this device.

Capabilities

Mind-reading can be accomplished by first having a computer learn which brain patterns are associated with specific thoughts, then storing the decoded information in a database. This machine learning is accomplished with artificial intelligence (AI) algorithms. A very basic analogy is a spell checker, which uses a database of common mistakes associated with particular sequences of letters to present suggestions to a user.
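
As a rough illustration of that learning step (a hypothetical toy, not any lab's actual pipeline), the sketch below trains an off-the-shelf classifier to associate noisy synthetic "brain patterns" with thought labels; the labels and feature counts are invented.

```python
# A toy sketch of the training step: learn a mapping from brain-activity
# feature vectors to thought labels. All data here is synthetic; a real
# system would use EEG/fMRI features instead.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_features = 50

# Pretend each thought ("home", "select", "delete") has a characteristic
# activity pattern, observed with noise over repeated trials.
prototypes = {lbl: rng.normal(size=n_features) for lbl in ("home", "select", "delete")}
X, y = [], []
for label, proto in prototypes.items():
    for _ in range(40):  # 40 noisy trials per thought
        X.append(proto + rng.normal(scale=0.8, size=n_features))
        y.append(label)

clf = LogisticRegression(max_iter=1000).fit(np.array(X), y)

# A new scan: a noisy observation of the "delete" pattern.
new_scan = prototypes["delete"] + rng.normal(scale=0.8, size=n_features)
print(clf.predict([new_scan]))  # expected: ['delete']
```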

“The new realization is that every thought is associated with a pattern of brain activity,” proclaimed neuroscientist John-Dylan Haynes in Newsweek International on February 4, 2008. “And,” says Haynes, “you can train a computer to recognize the pattern associated with a particular thought.”

In a January 2000 issue of US News and World Report, Lockheed Martin neuroengineer Dr. John Norseen announced, “Just like you can find one person in a million through fingerprints ... you can find one thought in a million.” This can be accomplished using AI and HCI, or what Dr. Norseen calls biofusion.

The decoded brain signals can be stored in a database. Then, when someone is scanned, the computer detects the pattern and matches it against the database of known meanings. But it is not necessary to scan a brain to decode its signals for every single thought, such as each individual picture.

Instead, after the machine has learned how to decipher the patterns associated with specific thoughts such as images, more images can be added to the program, and the computer can use the process it applied to the earlier images as a model to detect the additional thoughts with reasonable accuracy.
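
The lookup side of this can be illustrated the same way: a minimal, hypothetical sketch that matches a new pattern against a stored database by cosine similarity and returns the closest known meaning. The labels and vector sizes are invented for illustration.

```python
# A toy sketch of the lookup step: match a new brain pattern against a
# database of previously decoded patterns by cosine similarity.
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(1)
database = {label: rng.normal(size=20) for label in ("cat", "house", "tool")}

# A new scan: a noisy observation of the stored "house" pattern.
new_pattern = database["house"] + rng.normal(scale=0.3, size=20)

# Pick the stored meaning whose pattern most resembles the new scan.
best = max(database, key=lambda label: cosine(database[label], new_pattern))
print(best)  # expected: 'house'
```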

Both words and images can be detected using mind-reading devices with varying degrees of accuracy. This can occur for words and images being viewed by a person on an external display, such as a book, or words and images just being thought of with no external stimuli.

“It is possible to read someone’s mind by remotely measuring their brain activity,” announced New Scientist in its April 2005 article Mind-Reading Machine Knows What You See. The Computational Neuroscience Laboratories at the Advanced Telecommunications Research Institute International (ATR) in Kyoto, Japan, and Princeton University in New Jersey demonstrated that by monitoring the visual cortex with fMRI they could determine which basic objects (sets of lines) a person was looking at.

When the objects were combined, they could even determine which one was being focused on. According to the scientists, it may be possible not only to view but also to record and replay these images. They announced that the technology could be used to figure out dreams and other secrets in people’s minds.

Vanderbilt University in Nashville has conducted simple mind-reading tests using an fMRI hooked to a computer, which learned what basic images a group of test subjects was looking at. The researchers were able to predict with 50% accuracy which objects the subjects were thinking of when asked only to remember what they had seen, without being shown the images.

On March 6, 2008 ABC News reported that neuroscientists at the University of California at Berkeley accomplished mind-reading by monitoring the visual cortex with an fMRI connected to a self-learning (artificial intelligence) computer program.

First, they used 1,750 pictures to build a computational database for the computer to learn from, flashing the pictures in front of test subjects connected to an fMRI. This allowed the algorithm to decipher the brain patterns and associate them with the images.

In addition to deciphering these brain patterns, the computer recorded the process that it used to accomplish this, and built a model based upon it. Then, without scanning the test subjects, they added 120 new pictures to the program and allowed it to create its interpretation of what the new brain signals would be, based on the previous model.

Then they had the test subjects look at these pictures, which they had never seen, while being scanned. The computer predicted what they were looking at 72% of the time. The scientists announced that the model could be used as a basis to predict the brain activity associated with any image.

What this means is that it’s not necessary to scan a brain to obtain the meaning of each signal. Once the model had been developed, they could simply add new pictures to the database/dictionary. The scientists suggested that out of 1 billion pictures, the computer would be accurate about 20% of the time.
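
The identification logic described here can be sketched in a few lines of Python. The "encoding model" below is a stand-in random linear map over made-up image features; the point is the matching procedure itself (predict a pattern for every candidate image, then pick the best-correlated one), which is what the Berkeley work is reported to have done.

```python
# A toy sketch of identification by correlation: predict the activity
# pattern for every candidate image with a learned encoding model, then
# choose the image whose prediction best matches the measured pattern.
# The model and image features here are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(2)
n_voxels, n_feats, n_images = 200, 30, 120

W = rng.normal(size=(n_voxels, n_feats))             # stand-in encoding model
image_feats = rng.normal(size=(n_images, n_feats))   # features of new images
predicted = image_feats @ W.T                        # predicted pattern per image

true_index = 17
measured = predicted[true_index] + rng.normal(scale=2.0, size=n_voxels)

# Identify the viewed image: highest correlation between measured and
# predicted activity.
corrs = [np.corrcoef(measured, p)[0, 1] for p in predicted]
print(int(np.argmax(corrs)) == true_index)  # True for most random seeds
```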

Images which are not consciously seen by a person can even be detected by mind-reading machines. Researchers at University College London flashed pictures in quick succession to test subjects connected to an fMRI. Although some of these pictures were invisible to the subjects, they were accurately recorded 80% of the time by the computer.

Like a fingerprint, each person has their own brainprint. Therefore, calibration for each brain is necessary. This is accomplished by having the person think a series of specific thoughts. In the case of EEG, this calibration can take less than a minute. However, because the signals which represent thoughts are similar from one person to the next, a universal mind-reading database has been suggested.

Using fMRI, scientists at Carnegie Mellon University (CMU) discovered that the brain patterns associated with specific thoughts are quite similar among multiple people. This, they stated, would provide the opportunity to create a universal mind-reading dictionary.

Scientists at the University of California at Berkeley mentioned that a “general visual decoder” would have great scientific use. Likewise, the brain patterns associated with specific words that occur when people are reading are also basically the same. This similarity of brain function associated with words seems to have been an evolutionary development that conferred an advantage in communication.

A mind-reading machine capable of determining the brain pattern associated with a specific word was developed by scientists at CMU. Brain scans using fMRI were taken of test subjects who were given a variety of words to think of in order to train the computer. An important consideration here is that they were not viewing these words on an external display, only thinking about them. After the computer identified the brain patterns associated with those words, the subjects were given two new words to think about, which the computer accurately determined.

Although only a couple of words were tested in this particular study, it shows that once a model of how to decipher brain signals has been created, AI can accurately determine new words that subjects are thinking about. “These building blocks could be used to predict patterns for any concrete noun,” proclaimed Tom Mitchell of the Machine Learning Department.
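
A toy version of that idea follows, under the assumption (consistent with how the CMU approach is usually described) that a noun's brain pattern can be modeled as a weighted sum of its semantic features. All the data below is synthetic; the feature and voxel counts are invented.

```python
# A toy version of predicting brain patterns for unseen nouns: fit a linear
# model from semantic features to voxel activity, then predict patterns for
# nouns never scanned. Features, patterns, and sizes are synthetic.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(3)
n_nouns, n_feats, n_voxels = 58, 25, 500

semantic = rng.normal(size=(n_nouns, n_feats))   # e.g. co-occurrence stats
true_map = rng.normal(size=(n_feats, n_voxels))
patterns = semantic @ true_map + rng.normal(scale=0.5, size=(n_nouns, n_voxels))

# Hold out two nouns, train on the rest -- mirroring the two-new-words test.
train, test = slice(0, 56), slice(56, 58)
model = Ridge(alpha=1.0).fit(semantic[train], patterns[train])
predicted = model.predict(semantic[test])

# Match each held-out measured pattern to the closer predicted pattern.
for i, measured in enumerate(patterns[test]):
    dists = np.linalg.norm(predicted - measured, axis=1)
    print("correct" if dists.argmin() == i else "wrong")
```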

In February of 2004 Popular Science announced that a mind-reading computer could, in theory, translate a person’s working verbal memory onto a computer screen. “You could imagine thinking about talking and having it projected into a room 2,000 miles away,” says Professor Craig Henriquez at Duke University’s Center for Neuroengineering, who has studied mind-reading for DARPA. He added, “It’s very, very possible.”

fMRI can be used to determine whether someone is reading or writing. Neuroscientists can tell when a person is reading by monitoring their brainwaves, and can almost determine exactly what they are reading. And because these patterns are similar from one person to the next, a universal device for determining what people are reading is possible.

In March of 2008 both Technology Review and ABC News revealed that an fMRI could in theory be used to display a person’s dreams. Then in December of 2008, scientists at ATR in Kyoto, Japan announced that they had developed a technology that would eventually allow them to record and replay a person’s dreams.

Emotions from love to hate can be recognized by neuroimaging. The level of stress a person is experiencing can also be measured, as can brain states such as honesty, deception, and even self-deception.

Patterns associated with decisions can also be read. Scientists from CMU, Stanford University, and the MIT Sloan School of Management were able to accurately predict the purchasing decisions of test subjects in a virtual shopping center. They monitored the subjects’ level of interest in a product as well as their decision to purchase it.

Neuroimaging can also detect a decision about a high-level mental activity that someone intends to perform later.

Neuroimaging can be used to determine if someone is speaking or reading. It can be used to detect areas of the brain that are active when someone is hearing a sound, or touching an object.

Brain patterns associated with specific physical movements, such as moving a finger, can be deciphered with neuroimaging. The mere intention to make a physical movement can be detected before the movement is made.

Cameras

A type of mind-reading is possible with cameras connected to computers. One such device, called the Emotional Social Intelligence Prosthetic (ESP), was developed at the MIT Media Laboratory in 2006. It consists of a tiny camera that can be worn on a hat, an earphone, and a small computer worn on a belt. It infers a person’s emotional state by analyzing combinations of subtle facial movements and gestures.

When an emotional state is detected, the wearer is signaled through the earphone to adjust their behavior in order to hold the target’s attention. The computer can detect six emotional states. It can also be adjusted for cultural differences and configured specifically for the wearer.

Around this time, the Computer Laboratory at the University of Cambridge, UK, developed a similar camera-based mind-reading machine. It uses a computer to monitor, in real time, combinations of head movement, shape, color, smiles, and eyebrow activity to infer a person’s emotional state.

It detects basic emotional states such as happiness, sadness, anger, fear, surprise, and disgust, as well as more complex states. It’s accurate between 65 and 90 percent of the time. “The mind-reading computer system presents information about your mental state as easily as a keyboard and mouse present text and commands,” they announced.
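
Stripped of the computer-vision front end, this kind of camera-based inference reduces to ordinary supervised classification over facial features. A deliberately tiny, hypothetical sketch: the feature columns and labels are invented, and a real system would track dozens of facial action units over time.

```python
# A toy sketch of camera-based emotion inference: reduce each observation to
# numeric facial features, then classify. Features and labels are invented.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Columns: [smile_strength, eyebrow_raise, head_shake_rate]
X = np.array([
    [0.9, 0.2, 0.1], [0.8, 0.3, 0.0],   # happy
    [0.1, 0.1, 0.1], [0.0, 0.2, 0.2],   # sad
    [0.1, 0.9, 0.1], [0.2, 0.8, 0.0],   # surprised
])
y = ["happy", "happy", "sad", "sad", "surprised", "surprised"]

clf = DecisionTreeClassifier().fit(X, y)
print(clf.predict([[0.85, 0.25, 0.05]]))  # expected: ['happy']
```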

Used for Surveillance

Mind reading exists. The DOD and various institutions have vigorously researched this subject since at least the mid-1970s. “Mapping human brain functions is now routine,” declared US News and World Report in a January 2000 article entitled Reading Your Mind—And Injecting Smart Thoughts.

Both words and images, whether viewed or merely thought of, can be mind-read. Various emotional states, as well as mental processes such as decision-making, reading, writing, movement, and the intention to make a movement, can be detected with mind-reading devices. Perceptions such as touch, sound, and light can also be detected.

The proposed uses for mind-reading technology are positive. They include determining whether people in comas can communicate, helping stroke patients and those who have suffered brain injuries, aiding those with learning disorders, assisting with online shopping, and improving people’s communication skills.

However, other suggested uses include the monitoring of unconscious mental processes and the interrogation of criminal suspects and potential terrorists. Dr. John Alexander mentioned that recent developments in mind-reading technology would take surveillance to new levels by allowing investigators to “peer into the inner sanctum of the mind,” in order to determine whether a suspect has committed, or is likely to commit, a crime.

Dr. Norseen has sent R&D plans to the Pentagon to have tiny mind-reading devices installed at airports to profile potential terrorists. He suggested that these devices could be operational by 2005. In August of 2008, CNN stated that the knowledge the US military has obtained from mind-reading research could be used to interrogate the enemy.

Law Enforcement Technology announced in September of 2005 the existence of a new forensic technology known as Brain Fingerprinting, which has already been used in hundreds of investigations as a lie detector by the CIA, FBI and law enforcement agencies in the United States.

Brain Fingerprinting is admissible in court because, unlike a polygraph, which relies on emotional responses, it uses EEG to see how the brain reacts to words and pictures related to a crime scene. Dr. Larry Farwell, its inventor, says it is completely accurate. According to the report, it will be used to help speed up investigations.

Endnotes

1. Another possible method of obtaining information is Remote Viewing (RV): the ability to produce correct information about people, events, objects, or concepts that are somewhere else in space and time and completely hidden from the viewer collecting the information. It can be used to describe people or events, produce leads, reconstruct events, make decisions, and make predictions about the future.

See Remote Viewing Secrets by Joseph McMoneagle.

RV tests were conducted by the US government over a 20-year period during Project Stargate, a classified initiative by the CIA which began in 1972 and lasted until about 1994. Most of the 154 tests and 26,000 trials took place at the Cognitive Sciences Laboratory at Fort Meade, Maryland.

A majority of the results of the project are still classified. See the Journal of Parapsychology articles, Remote Viewing by Committee, September 22, 2003, by Lance Storm and Experiment One of the SAIC Remote Viewing Program, of December 1, 1998, by Richard Wiseman and Julie Milton. The success of the project varies depending on the source.

Allegedly the original tests were conducted under rigid scientific conditions and yielded impressive results. However, the same sources describe RV in general as ineffective. See Discover Magazine's article, CIA ESP, of April 1, 1996, by Jeffrey Kluger, and the Washington Post's report, Many Find Remote Viewing a Far Fetch from Science, of December 2, 1995, by Curt Suplee. According to author McMoneagle, an original viewer during Project Stargate, it is accurate about 50 or 60 percent of the time. Nevertheless, RV will be used to obtain intelligence, according to John B. Alexander.

See The New Mental Battlefield, which appeared in the December 1980 issue of Military Review.

Also see the June 1998 Research Report Number 2 of the University of Bradford's Non-Lethal Weapons Research Project (BNLWRP), for how RV has been added to the NLW arsenal.

According to multiple sources, US government agencies are now using the consulting services of RV professionals. This was reported on January 9, 2002 in the University Wire's (Colorado Daily) article, Clairvoyant Reveals Details of Remote Viewing, by Wendy Kale, and in the Bulletin of the Atomic Scientists on September 1, 1994, in its report, The Soft Kill Fallacy, by Steven Aftergood.

In his book Winning the War: Advanced Weapons, Strategies, and Concepts for the Post-9/11 World, Alexander had this to say regarding RV: "Since the beginning of history, humans have made anecdotal references to innate abilities to foretell the future, to know what was occurring at distant locations or the status of people separated from them, and to find resources they need without any traditional means of accessing that information."

He continued: "Studies have demonstrated beyond any doubt that these nontraditional capabilities exist. … [RV can] radically change our means of gathering intelligence. It holds the promise of providing information about inaccessible redoubts and advances in technology. More importantly, once these skills are understood, those possessing them will be able to determine an adversary's intent and be predictive about the events."

2. Because neuroimaging technology decodes brain patterns to thoughts, some argue that it technically doesn't read a person's mind. However, because specific thoughts and brain states can be deciphered, here it is referred to as mind-reading. Additionally, most mainstream documents refer to this as mind-reading, despite the fact that it is actually brainwave-reading.

3. Magnetoencephalography (MEG) and positron emission tomography (PET) can also be used to infer a person's neurophysiological state. But because they are impractical for field use due to their large size and harmful radiation, MEG and PET won't be considered here. However, DARPA is in the process of developing a small helmet-sized MEG device which would be connected to a portable computer. See the article Mind over Machine in the February 1, 2004 issue of Popular Science, by Carl Zimmer.



For more on this topic, see Real-time decoding of question-and-answer speech dialogue using human cortical activity.

« Last Edit: September 04, 2019, 06:23:31 AM by Chip »







