dopetalk does not endorse any advertised product, nor does it accept any liability for its use or misuse

This website has run out of funding, so feel free to contribute if you can afford it (see footer)

Recent Posts

Pages: 1 ... 7 8 9 10
91
https://www.livescience.com/technology/artificial-intelligence/us-air-force-wants-to-develop-smarter-mini-drones-powered-by-brain-inspired-ai?utm_term=032043BB-1CB4-4440-A845-2FF7DCCBD37B&lrh=1e7f7a9239bb44f191dc979b8fe5e634e587dfe020b84a653d2040468a8b342b&utm_campaign=368B3745-DDE0-4A69-A2E8-62503D85375D&utm_medium=email&utm_content=83303A6A-B062-4168-AED9-969C76D730B6&utm_source=SmartBrief

US Air Force wants to develop smarter mini-drones powered by brain-inspired AI chips

May 5, 2025

Plans are underway to create new AI-powered drones that can fly for much longer than current designs


Although neuromorphic computing was first proposed by scientist Carver Mead in the late 1980s, it is a field of computer design theory that is still in development. (Image credit: Anton Petrus/Getty Images)

Scientists are developing an artificial intelligence (AI) chip the size of a grain of rice that can mimic human brains — and they plan to use it in miniature drones.

Although AI can automate monotonous functions, it is resource-intensive and requires large amounts of energy to operate.

Drones also require energy for propulsion, navigation, sensing, stabilization and communication.

Larger drones can better compensate for AI's energy demands by using an engine, but smaller drones rely on battery power — meaning AI energy demands can reduce flying time from 45 minutes to just four.
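
A quick calculation (my own arithmetic, inferred from the article's figures rather than stated in it) shows why that drop is so dramatic: if the same battery lasts 45 minutes without AI and only four minutes with it, the AI hardware must be drawing roughly ten times as much power as flight itself.

# Implication of the 45-minute vs. four-minute figures above (my arithmetic, not the article's).
# Same battery energy E in both cases: E = P_flight * 45 = (P_flight + P_ai) * 4.
minutes_without_ai = 45
minutes_with_ai = 4
ai_power_in_flight_units = minutes_without_ai / minutes_with_ai - 1
print(f"AI draws about {ai_power_in_flight_units:.1f}x the power used for flight")  # ~10.2x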

But this may not be a problem forever; Suin Yi and his team at the University of Texas have been awarded funding through the 2025 Air Force Office of Scientific Research Young Investigator Program to develop an energy-efficient AI for drones.

Their goal is to build a chip the size of a grain of rice with various AI capabilities — including autonomous piloting and object recognition — within three years.

AI-powered miniature drones:

To build a more energy-efficient AI chip, the scientists propose using conducting polymer thin films.

These are (so far) an underused aspect of neuromorphic computing, a computing approach that mimics the brain's structure to enable highly efficient information processing.

The researchers intend to replicate how neurons learn and make decisions, saving energy by activating circuitry only when it is required, similar to how a human brain uses different parts for different functions.
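
As a rough illustration of that event-driven principle (a generic sketch of neuromorphic-style computation, not the team's chip or its conducting-polymer devices), a leaky integrate-and-fire neuron only does work when its input pushes it over a threshold and stays quiet the rest of the time:

# Toy leaky integrate-and-fire neuron: the event-driven idea behind neuromorphic computing.
def simulate_lif(inputs, threshold=1.0, leak=0.9):
    """Return the time steps at which the neuron fires."""
    potential = 0.0
    spikes = []
    for t, current in enumerate(inputs):
        potential = potential * leak + current  # membrane potential decays ("leaks") each step
        if potential >= threshold:              # the neuron only acts when driven hard enough
            spikes.append(t)
            potential = 0.0                     # reset after firing
    return spikes

# A mostly quiet input stream: the neuron fires (does work) at only a few moments,
# which is where the energy savings over always-on arithmetic come from.
print(simulate_lif([0.1, 0.0, 0.0, 0.8, 0.6, 0.0, 0.0, 0.0, 1.2, 0.0]))  # -> [4, 8]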

Although neuromorphic computing was first proposed by scientist Carver Mead in the late 1980s, it is a field of computer design theory that is still in development.

In 2024, Intel unveiled its Hala Point neuromorphic computer, which is powered by more than 1,000 new AI chips and performs 50 times faster than conventional computing systems.

Meanwhile, the Joint Artificial Intelligence Center develops AI software and neuromorphic hardware.

Their particular focus is on developing systems for sharing all sensor information with every member of a network of neuromorphic-enabled units.

This technology could allow for greater situational awareness, with applications so far including headsets and robotics.

Using technology developed through this research, drones could become more intelligent by integrating conducting polymer material systems that can function like neurons in a brain.

If Yi’s research project is successful, miniature drones could become increasingly intelligent.

An AI system using neuromorphic computing could allow smaller and smarter automated drones to be developed to provide remote monitoring in confined locations, with a much longer flying time.
92
Astronomy / Do other planets have seasons?
« Last post by smfadmin on May 06, 2025, 05:04:09 PM »
https://www.livescience.com/space/do-other-planets-have-seasons?utm_medium=referral&utm_source=pushly&utm_campaign=Clicked%20Last%2090

May 5, 2025

Earth has four seasons, but do other planets in our solar system also have hot summer days and cold winter nights?


Earth has four seasons due to its 23.5-degree tilt and the shape of its orbit around the sun. But do other planets have seasonal patterns too? (Image credit: ANDRZEJ WOJCICKI/SCIENCE PHOTO LIBRARY via Getty Images)

Every year, Earth follows a familiar pattern of seasonal changes: As summer rolls around in the Northern Hemisphere, winter creeps in in the Southern Hemisphere, and vice versa.

But do other planets have seasonal patterns, too?

Indeed, other planets, dwarf planets and moons in our solar system do have seasonal cycles — and they can look wildly different from the ones we experience on Earth, experts told Live Science.

To understand how other planets have seasons, we can look at what drives seasonal changes on our planet. "The Earth has its four seasons because of the spin axis tilt," Gongjie Li, an astrophysicist at Georgia Tech, told Live Science. This means that our planet rotates at a slight angle of around 23.5 degrees.

So, when Earth is on one side of the sun, the Northern Hemisphere is pointed toward the sun and the Southern Hemisphere is facing away, explained Shane Byrne, a planetary science professor at the University of Arizona. When our planet moves to the other side of the sun six months later, the Northern Hemisphere faces away.

Mars has an axial tilt of around 25 degrees. Because this value is so close to Earth's tilt of 23.5 degrees, the amount of seasonal variation on Mars is similar to our planet's.
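
To see why similar tilts mean similar seasons, here is a back-of-the-envelope sketch (my own illustration, assuming circular orbits, so it ignores the eccentricity effects discussed below): the latitude the sun sits directly over swings between plus and minus the axial tilt over the year, so Earth and Mars get seasonal swings of similar size, while an untilted planet gets almost none and a 90-degree tilt gives extremes.

import math

def subsolar_latitude(tilt_deg, fraction_of_year):
    # Approximate latitude receiving overhead sun, for a circular orbit.
    return tilt_deg * math.sin(2 * math.pi * fraction_of_year)

for name, tilt in [("Mercury", 0.0), ("Earth", 23.5), ("Mars", 25.0), ("Uranus", 90.0)]:
    swing = [round(subsolar_latitude(tilt, f), 1) for f in (0.0, 0.25, 0.5, 0.75)]
    print(f"{name:8s} tilt {tilt:4.1f} deg -> subsolar latitude over the year: {swing}")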

"Just like on the Earth, you have permanent darkness and permanent daylight at the polar areas, depending on whether you're in the winter or the summer; then you can flip between those two states every half a year," Byrne told Live Science. But interestingly, instead of water-based ice, winters on Mars are dominated by carbon dioxide ice (or dry ice), which can form spidery cracks on the surface on the planet.

In contrast, some planets in our solar system have massively different tilts, leading to more extreme changes throughout the year. For instance, Mercury has no tilt, so "there's almost no seasonal change at all," Byrne said. On the opposite end of the spectrum, Uranus has a 90-degree tilt, so the poles are either completely facing the sun, or not at all, meaning its seasons are intense: Summers are filled with long stretches of constant blazing sun, while winters plunge into perpetual chilling darkness.

But tilt isn't the only factor that drives seasonal changes.

The shape of a planet's orbit can also influence seasonal variations in weather. Planets' orbits tend to be ellipses rather than perfect circles, so some planets are sometimes very far from the sun and at other times very close to it.

For example, Mercury has an "eccentric" orbit, which contributes to its seasonal variations, Li said. Pluto, too, has a very elliptical orbit that pushes its variations to an extreme, Byrne said.

These two factors — a planet's tilt and the shape of its orbit — can also change over time. Byrne, who studies climate records on Mars, explained that the tilt on the Red Planet wasn't always 25 degrees.

In fact, models, published in the journal Earth and Planetary Science Letters in 2018, have shown that Mars's tilt has varied from 10 to more than 40 degrees over the course of billions of years. This has led to extreme fluctuations in the planet's yearly cycle.

"So it's almost just random chance that it happens to be similar to the Earth today," Byrne said.

"On Earth, we're very lucky, this spin axis is quite stable," Li said. Due to this, we've had relatively stable seasonal cycles that have persisted for millennia, although the broader climate sometimes shifts as the entire orbit of Earth drifts further or closer from the sun.

Such stability has likely helped life as we know it develop here, Li said. Scientists like her are now studying planetary conditions and seasonal changes on exoplanets to see whether life could exist in far-off worlds. For now, it seems as though the mild seasonal changes and stable spin tilts on Earth are unique.
93
https://www.livescience.com/technology/computing/quantum-miracle-material-can-store-information-in-a-single-dimension-thanks-to-newly-discovered-magnetic-switching?utm_medium=referral&utm_source=pushly&utm_campaign=All%20Push%20Subscribers

Quantum 'miracle material' can store information in a single dimension thanks to newly discovered magnetic switching

May 5, 2025

Scientists have developed a method for storing quantum information in a single dimension, thereby reducing decoherence, using chromium sulfide bromide.


Magnetic switch traps quantum information carriers in one dimension. (Image credit: Brad Baxley, Part to Whole. For use reporting on this study, DOI: 10.1038/s41563-025-02120-1)

Scientists have discovered how to use a quantum material to tap into the power of magnetism to store quantum information — thanks to its capacity to support magnetic switching (when the magnetic polarization switches direction). They say it can lead to more viable quantum computing and sensing, thanks to much longer-lasting quantum states.

Chromium sulfide bromide is an unusual material that has been likened to filo pastry (thin, folded layers of pastry) thanks to its structure of just a few layers of atoms.

Scientists consider it extremely promising for quantum devices because its properties can be used for many different types of information storage.

It can be used to store information using an electric charge, as photons (as light), through magnetism (through the electronic spin) and even via phonons — like vibrations from sound.

One of the many ways in which chromium sulfide bromide could be used to store information is through excitons — quasi-particles that form when an electron and its hole become bound together.

When an electron is excited out of its ground energy state, it effectively leaves behind a hole where it once was. Although they are separated, the electron and the hole remain paired together and are known collectively as an exciton.

Previous research has highlighted how these excitons can sometimes form in a straight line in the material. But these excitons also exhibit unusual magnetic properties.

At temperatures less than 132 Kelvin (-222 degrees F or -141 degrees C), the material's layers are magnetized and the electrons are aligned, while the direction of the magnetic field switches for each layer in the material.

When chromium sulfide bromide is warmed to more than 132 K, the material loses its magnetization as the electrons can move in random directions.

In this unmagnetized state, the excitons are no longer trapped and extend over multiple layers of the material.

However, when chromium sulfide bromide is only a single atom thick, the excitons are confined to a single dimension. When used in a quantum device, this restriction could allow quantum information in the excitons to be stored much longer than it would otherwise be, as the excitons are less likely to collide with each other and lose the information they carry through decoherence (the loss of quantum information due to interference).

Quantum information in one dimension:

In the new study published Feb. 19 in the journal Nature Materials, scientists reported that they had produced excitons in chromium sulfide bromide by firing pulses of infrared light in 20 bursts lasting only 20 quadrillionths of a second (20 × 10⁻¹⁵ seconds).

They then used a second infrared laser to nudge the excitons into a higher energy state, before finding they had created two different variations of exciton when they should otherwise have had identical states of energy.

When the less energetic pulses were shot by lasers from different axes, the researchers discovered that the direction-dependent excitons could be confined to a single line or expanded into three dimensions.

The change from one-dimensional to three-dimensional excitons accounted for how long the excitons could last without colliding with each other.

"The magnetic order is a new tuning knob for shaping excitons and their interactions.

This could be a game changer for future electronics and information technology," said co-author of the study Rupert Huber, professor of experimental and applied physics at the University of Regensburg, Germany.

One of the key areas the research team wants to pursue next is to investigate whether these excitons could be converted to magnetic excitations in the electronic spin of the material.

Were they to achieve this, it could provide a useful method for converting quantum information between different subatomic particles (photons, excitons and electrons).

Switching between magnetized and non-magnetized states could provide a fast method for converting photon and spin-based quantum information.

The hope with chromium sulfide bromide is to harness all of its properties for use in future devices.

"The long-term vision is, you could potentially build quantum machines or devices that use these three or even all four of these properties: photons to transfer information, electrons to process information through their interactions, magnetism to store information, and phonons to modulate and transduce information to new frequencies," said co-author of the study Mackillo Kra, professor of electrical and computer engineering at the University of Michigan, in a statement.
94
https://theconversation.com/how-we-discovered-specific-brain-cells-that-enable-intelligent-behaviour-254233

How we discovered specific brain cells that enable intelligent behaviour

May 2, 2025 1.28am

For decades, neuroscientists have developed mathematical frameworks to explain how brain activity drives behaviour in predictable, repetitive scenarios, such as while playing a game. These algorithms have not only described brain cell activity with remarkable precision but also helped develop artificial intelligence with superhuman achievements in specific tasks, such as playing Atari or Go.

Yet these frameworks fall short of capturing the essence of human and animal behaviour: our extraordinary ability to generalise, infer and adapt. Our study, published in Nature late last year, provides insights into how brain cells in mice enable this more complex, intelligent behaviour.

Unlike machines, humans and animals can flexibly navigate new challenges. Every day, we solve new problems by generalising from our knowledge or drawing from our experiences. We cook new recipes, meet new people, take a new path – and we can imagine the aftermath of entirely novel choices.

It was in the mid-20th century that psychologist Edward Tolman described the concept of “cognitive maps”. These are internal, mental representations of the world that organise our experiences and allow us to predict what we’ll see next.

Starting in the 1970s, researchers identified a beautiful system of specialised cells in the hippocampus (the brain’s memory centre) and entorhinal cortex (an area that deals with memory, navigation, and time perception) in rodents that form a literal map of our environments.

These include “place cells”, which fire at specific locations, and “grid cells” that create a spatial framework. Together, these and a host of other neurons encode distances, goals and locations, forming a precise mental map of the physical world and where we are within it.


Section of mouse hippocampus. Alexandros A Lavdas/Shutterstock

And now our attention has turned to other areas of cognition beyond finding our way around: generalisation, inference, imagination, social cognition and memory.

The same areas of the brain that help us navigate in space are also involved in these functions.

Cells for generalising?

We wanted to know if there are cells that organise the knowledge of our behaviour, rather than the outside world, and how they work.

Specifically, what are the algorithms that underlie the activity of brain cells as we generalise from past experience? How do we rustle up that new pasta dish?

And we did find such cells. There are neurons that tell us “where we are” in a sequence of behaviour (we haven’t named the cells).

To uncover the brain cells, networks and algorithms that perform these roles, we studied mice, training the animals to complete a task.

The task had a sequence of actions with a repeating structure. Mice moved through four locations, or "goals", containing a water reward (A, B, C and D) in loops.

When we moved the location of the goals, the mice were able to infer what came next in the sequence – even when they had never experienced that exact scenario before.

When mice reached goal D in a new location for the first time, they immediately knew to return to goal A. This wasn’t memory, because they’d never encountered it.

Instead, it shows that the mice understood the general structure of the task and tracked their position within it.

The mice had electrodes implanted into the brain, which allowed us to capture neural activity during the task. We found that specific cells in the cortex (the outermost layer of the brain) collectively mapped the animal’s goal progress. For example, one cell could fire when the animal was 70% of the way to its goal, regardless of where the goal was or how far away.

Some cells tracked progress towards immediate subgoals – like chopping vegetables in our cooking analogy – while others mapped progress towards the overall goal, such as finishing the meal.

Together, these goal progress cells created a system that encoded our location in behavioural space rather than physical space.

Crucially, the system is flexible and can be updated if the task changes. This encoding allows the brain to predict the upcoming sequence of actions without relying on simple associative memories.
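
One toy way to picture a goal-progress cell (my own sketch of the concept, not the authors' analysis code) is as a unit tuned to a preferred fraction of the way to the current goal, responding the same way whether the route is long or short:

import math

def goal_progress_cell(preferred=0.7, width=0.1):
    # Toy tuning curve: firing is strongest when the animal is at its preferred
    # fraction of the way to the current goal, regardless of the goal's distance.
    def rate(distance_travelled, total_distance):
        progress = distance_travelled / total_distance  # 0.0 at the start, 1.0 at the goal
        return math.exp(-((progress - preferred) ** 2) / (2 * width ** 2))
    return rate

cell = goal_progress_cell(preferred=0.7)  # "fires when ~70% of the way to the goal"
print(round(cell(70, 100), 2))  # 1.0  -> 70% of a long route
print(round(cell(7, 10), 2))    # 1.0  -> 70% of a short route: same response
print(round(cell(20, 100), 2))  # 0.0  -> early in the route: mostly silent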

Common experiences:

Why should the brain bother to learn general structural representations of tasks?

Why not create a new representation for each one?

For generalisation to be worthwhile, the tasks we complete must contain regularities that can be exploited — and they do.

The behaviour we compose to reach our goals is replete with repetition.

Generalisation allows knowledge to extend beyond individual instances. Throughout life, we encounter a highly structured distribution of tasks, and each day we solve new problems by generalising from past experiences.

A previous encounter with making bolognese can inform a new ragu recipe, because the same general steps apply to both (such as starting with frying onions and adding fresh herbs at the end). We propose that the goal-progress cells in the cortex serve as the building blocks – internal frameworks that organise abstract relationships between events, actions and outcomes.

While we’ve only shown this in mice, it is plausible that the same thing happens in the human brain.

By documenting these cellular networks and the algorithms that underlie them, we are building new bridges between human and animal neuroscience, and between biological and artificial intelligence. And pasta.



95
https://www.popularmechanics.com/science/a64636206/altered-consciousness-psychedelic/?source=nl&utm_source=nl_pop&utm_medium=email&date=050425&utm_campaign=nl02_050425_POP39603700&oo=&user_email=1e7f7a9239bb44f191dc979b8fe5e634e587dfe020b84a653d2040468a8b342b&GID=1e7f7a9239bb44f191dc979b8fe5e634e587dfe020b84a653d2040468a8b342b&utm_term=TEST-%20NEW%20TEST%20-%20Sending%20List%20-%20AM%20180D%20Clicks%2C%20NON%20AM%2090D%20Opens%2C%20Both%20Subbed%20Last%2030D

Psychedelic Trips Defy Words—That’s the Key to Unlocking Higher Consciousness, Scientists Say

May 02, 2025 4:12 PM


Under the care of a traditional Peruvian healer, serial entrepreneur Mark Gogolewski took a powerful Amazonian psychedelic as part of his healing process from alcoholism. As the ayahuasca ceremony deepened, Gogolewski felt himself pulled to the brink of death, but also felt an encouragement to just let go and jump, he says. But something caught him—and it was "infinite love," he says. "Like, you can imagine anything you might want—the beautiful, loving light, the source, whatever word you use—we touch it. It's not just ineffable. It's everything. ... And I will never forget it, because it was beyond anything I could have ever imagined. I can't give you exact words, but I remember the feeling of those words."

Gogolewski’s struggle to put the experience into words touches on a larger mystery: why do so many people who undergo altered states of consciousness find themselves unable to explain what they felt? Studies are revealing that these states may be fundamentally outside the bounds of human language. Or perhaps language itself is a filter—a cage, even—that blocks us from grasping deeper truths.

Dr. Dave Rabin, Ph.D., a psychiatrist and neuroscientist who studies psychedelics and trauma, believes the disappearance of language in psychedelic states is not a glitch—it’s the point. “Psychedelic experiences—whether they’re accessed through medicine augmentation, deep meditation, breathwork, or other non–drug-induced methods—can result in states of extraordinarily high levels of presentness,” Rabin says.

In those moments, he explains, the mind shifts away from ego and the past and enters a mode of “just listening to what’s happening in the moment,” he says. “Our language center requires higher cortical levels of processing [parts of the brain involved in planning, memory, and conscious thought] that draw from our past knowledge and experience,” Rabin says. “So, when we find ourselves in states of extraordinary presentness—whether psychedelic drugs are involved or not—these states can leave us with an absence of words, or what we call ineffability.”

This beyond-words feeling doesn’t hit us because language is broken, Rabin suggests, but because it’s temporarily irrelevant. Describing an experience, especially in the peak psychedelic moment, actually removes us from the experience, because “we’re putting it through a filter in our minds to describe it, to attempt to define it.”

Yet, it’s through language that we’ve built laws, literature, religion, and reason itself. Human civilization depends on our ability to preserve and transmit knowledge through structured, symbolic communication. As philosopher Ludwig Wittgenstein famously wrote, “The limits of my language mean the limits of my world.” Which, in a way, goes both ways. Language expands our reality—or quietly narrows it, too.

A 2024 study published in Social Psychological and Personality Science found that language doesn’t just express ideas; it encodes and spreads our attitudes across cultures and centuries, even the ones we don’t realize we have. Another 2024 paper, published on the preprint server arXiv and titled The Age of Spiritual Machines, offers striking evidence that reducing attention to language may itself induce altered states of consciousness, even in artificial intelligence (AI) models. When researchers dampened the language-processing functions of AI systems, the models began to resemble disembodied, ego-less, and unitive states—in short: the AIs tripped.

Michael Valdez, MD, a neurologist, addiction specialist, and medical director of Detox California, agrees that altered states reshape how language functions, but from a different angle. “Whether it is achieved through meditation, psychedelics, sleep deprivation, or trauma … language becomes less literal and more symbolic or metaphoric, as words become links to emotions that are felt rather than thought of.” He notes that during an altered state of consciousness, the experience of time, space, and reality can shift dramatically. So can the way people speak, leading to “fragmented and disjointed thoughts.” But in Valdez’s view, this is not linguistic failure—it’s a poetic reorientation.

In these moments, language stops being strictly logical and begins to resemble emotion in verbal form—metaphoric, symbolic, and affective, Valdez says. And while the words may sound jumbled on the surface, at their core they may open a path toward insight: “A new way of seeing, and perhaps, a new way of being,” Valdez says.

For Gogolewski, who wrote the book How to Be OK (When You’re Supposed to Be OK But You’re Not), the challenge of expression didn’t end with the February 2024 ayahuasca ceremony. For the last eight years, he has been studying Kabbalah and Buddhism, and he has found that words often fail in the face of symbols and metaphors rooted in ancient traditions. “The Buddhists would use these phrases that were impossible to understand purely with the mind. You’d have to wrestle with them before you could get an answer. Like, one I love right now is: ‘How you do one thing is how you do everything.’”

It could be Buddhism, Sufism in Islam, or Christian mysticism—“it doesn’t really matter,” Gogolewski says. What matters is the “spiritually rigorous vocabulary” that helps people in groups talk about things that might otherwise remain beyond “commoner” everyday language.

He has spent years trying to find better ways to describe what he experienced in that psychedelic ceremony—and still can’t. “I’m just going to spend the rest of my life trying to figure out better words,” Gogolewski says.



96
Alcohol & Tobacco / Why Alcohol Hits Women Harder
« Last post by Chip on May 05, 2025, 10:46:02 AM »
"Why Alcohol Hits Women Harder"

 https://neurosciencenews.com/women-aud-neuroscience-28810/#:~:text=Why%20Alcohol%20Hits%20Women%20Harder

May 4, 2025

Summary:

Alcohol use among women has surged to match men’s rates, but women face far greater health risks even at lower consumption levels. A growing body of research is uncovering key neurobiological sex differences that influence why and how women drink, with stress being a more prominent motivator for women.

Scientists are now investigating brain circuits, neuroimmune responses, and hormonal influences to better understand these differences and to develop more tailored, effective treatments. This work is especially urgent, as alcohol-related deaths and health complications are rising faster in women, yet existing treatments have been largely designed and tested on men.

Key Facts:

● Biological Vulnerability: Women metabolize alcohol differently, leading to higher blood alcohol levels and greater health risks at lower doses.

● Brain Differences: Women’s brains show distinct neuroimmune responses and stress-driven pathways linked to alcohol use disorder.

● Tailored Treatments Needed: Current alcohol use disorder medications are based on male-focused research, underscoring the need for sex-specific therapeutics.

Alcohol use disorder is a chronic disease that used to disproportionately affect men. But for the first time in history, women are catching up.

Today, women in the United States are drinking and engaging in harmful alcohol use at rates on par with their male counterparts.

Historically, research on alcohol use disorder—characterized by an impaired ability to control alcohol use despite negative consequences—has focused on men.

“Alcohol use disorder is incredibly heterogenous,” says Sherry McKee, PhD, professor of psychiatry at Yale School of Medicine (YSM) and director of the Yale Program on Sex Differences in Alcohol Disorder.

“Not every medication is going to work for every person. And one of the key pieces that’s been missing in our research is a focus on sex and gender.”

Alcohol use is a growing women’s health issue:

Last year, the Centers for Disease Control and Prevention identified a disturbing trend—between 2016 and 2021, rates of alcohol-related deaths increased by 35% in women (and 27% in men). Some researchers hypothesize that changes in social norms may be one of the drivers of increased drinking in women.

“Women are now earning more, delaying marriage, and delaying childbirth,” says McKee.

“It’s thought that this might create more time and space for drinking.”

And, even among those who are married and starting families, alcohol companies have increasingly focused their marketing on women. Terms like “mommy juice” have grown in popularity, and gendered alcoholic drinks such as “Mom Water” have appeared on shelves in liquor stores.

“We’ve seen that marketing toward moms has normalized ‘wine mom culture,’” says Kelly Cosgrove, PhD, professor of psychiatry, of neuroscience, and of radiology and biomedical imaging at YSM.

Research shows that the COVID-19 pandemic exacerbated the rise in drinking. One 2020 study found that the number of days in which women reported heavy alcohol use—at least four drinks within a couple of hours—rose by 41%.

“It had to do with the amount of time that people were home and the stress that they were under,” explains Marina Picciotto, PhD, Charles B. G. Murphy Professor of Psychiatry at YSM.

The rise in alcohol use is especially concerning because women face greater drinking-related health risks at lower amounts of alcohol than men. “We refer to this as the risk-severity paradox,” says McKee.

Studies have found that women who drink are at a disproportionately greater risk for brain damage, cognitive deficits, various cancers such as breast cancer, cardiovascular issues, liver injury, and immune system dysfunction.

Drinking is also associated with greater risk of mental health issues and suicide, physical and sexual assault, and pregnancy- and perinatal-related complications. Furthermore, alcohol can cause hormonal imbalances and menstrual irregularities.

The number of alcohol-related deaths is not only rising faster among women, but also driven by lesser amounts of alcohol. Men need to drink at least 3.2 drinks per day to be at increased risk of premature death, whereas women only need to consume 1.8.

“Not even two drinks a day is putting a woman at significantly increased risk,” says McKee.

These disparities are reflected in the health care system. Emergency room visits related to alcohol use increased by 70% in women versus 58% in men between 2006 and 2014, and hospitalizations rose by 69% in women (compared to 43% in men) between 2000 and 2015.

Why does alcohol disproportionately impact women?

Women experience greater health risks from drinking, in part, because they metabolize alcohol differently than men.

Alcohol is not lipid-soluble. In other words, when we consume alcohol, it does not enter fat tissue—it only disperses into tissue that contains water.

Women tend to have a lower percentage of body water and more fat tissue than men. Thus, they have less fluid to dilute the alcohol, leading to a higher blood alcohol concentration (BAC).

Furthermore, the primary enzyme involved in the metabolism of alcohol—alcohol dehydrogenase—is as much as 40% less active in women.

As a result, when a woman and a man of the same age and weight drink the same amount at the same rate, the woman will experience a greater BAC.

“Let’s say they’re both 150 pounds and 48 years old, and they consume three drinks in two hours,” McKee says.

“The man will be significantly under the legal drinking limit for driving, and the woman will be over the legal drinking limit—just because of this difference in how alcohol is metabolized.”
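
McKee's numbers can be roughly reproduced with the standard Widmark estimate (my own worked example using common textbook constants, not figures from the article): divide the alcohol consumed by body weight times a body-water distribution factor, then subtract what has been metabolized. Because the distribution factor is lower for women, the same three drinks land on opposite sides of a 0.08% limit:

# Rough Widmark-style estimate of blood alcohol concentration (grams per 100 mL).
# The constants below are common textbook approximations, not values from the article.
def estimate_bac(drinks, weight_lb, hours, r):
    grams_alcohol = drinks * 14.0         # ~14 g of ethanol per US standard drink
    weight_g = weight_lb * 453.6          # pounds -> grams
    bac = grams_alcohol / (weight_g * r) * 100.0
    return max(bac - 0.015 * hours, 0.0)  # subtract roughly 0.015% per hour of metabolism

# McKee's scenario: both people weigh 150 pounds and have three drinks over two hours.
man = estimate_bac(drinks=3, weight_lb=150, hours=2, r=0.68)    # typical male water fraction
woman = estimate_bac(drinks=3, weight_lb=150, hours=2, r=0.55)  # typical female water fraction
print(f"man:   {man:.3f}%")    # ~0.061% -> under a 0.08% driving limit
print(f"woman: {woman:.3f}%")  # ~0.082% -> over the same limit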

Current alcohol use disorder treatments don’t address underlying sex differences:

As alcohol-related harms continue to grow among women, women are also less likely than men to seek treatment. McKee believes that societal stigma surrounding alcohol consumption is the primary reason for this reluctance.

But another obstacle is that medical providers do not consider the sex-related differences when diagnosing and treating alcohol use disorder.

Women, for example, make up only about 13% of participants studied in research on withdrawal, according to a review in Frontiers in Psychiatry.

Medication trials for treating alcohol use disorder have also historically been conducted primarily on male cohorts. Only 1% of study participants involved in research on disulfiram—which, in 1948, became the first of the three FDA-approved medications for alcohol abuse—were women.

Studies show that naltrexone, another FDA-approved medication, is more likely to have side effects in women such as nausea and sleep disturbances—making them less likely to stick with the treatment. Research has not revealed any sex differences related to acamprosate, the third FDA-approved medication.

The Yale Program on Sex Differences in Alcohol Disorder is investigating key differences between the underlying mechanisms of alcohol addiction in women compared to men. One of its missions is to create more effective therapeutics tailored to women.

Emerging research, for example, shows that drivers of alcohol use differ between sexes.

While men are more likely to drink to experience the positive aspects of alcohol, such as feelings of pleasure and connecting socially with others, women are more likely to drink to help manage stress.

“So, we’re developing medications that target stress pathophysiology,” says McKee.

This may be explained by key sex differences in the brain that emerge during adolescence, particularly differences in the interactions of three key neural systems—the prefrontal cortex, striatum, and amygdala.

1. The prefrontal cortex is responsible for executive functions such as impulse control and decision-making.

2. The striatum drives pleasure-seeking behaviors and processes rewards, and becomes activated when individuals engage in risk-taking scenarios.

3. The amygdala is associated with our emotional responses, especially those related to fear.

Brain imaging studies have shown that men have greater activation in the striatum, which makes them more likely to engage in risky behaviors.

The prefrontal cortex of women, on the other hand, develops earlier than in men—acting as a brake on impulsivity.

However, women also show greater reactivity of the amygdala, which makes them more susceptible to anxiety or other mood disorders.

This may explain why women, unlike men, do not seek alcohol for the thrill of drinking, but rather to cope with negative emotions, McKee says.

These findings may also explain why women who suffered major stressors in early childhood, such as abuse or neglect, might be especially vulnerable to developing alcohol use disorder.

“The prefrontal cortex and the amygdala are key brain regions that interact with the experience of childhood adversity, childhood trauma, and stress,” says McKee.

“Our working hypothesis is that these experiences will lead to the internalization of disorders such as anxiety and depression, which then lead to alcohol use. A lot of data suggest that this is a common pathway for alcohol use disorder in women.”

A closer look at sex differences in the female brain:

Yale School of Medicine researchers are now diving even deeper into understanding how differences in the brain may be uniquely driving alcohol use disorder in women. Cosgrove, for instance, is focused on the neuroimmune system.

“Most of us are familiar with the peripheral immune system—it keeps us healthy,” she says.

“But the brain has its own immune system, and we’ve been focused on that because it’s responsible for healthy brain function. It’s at the root of everything.”

Cosgrove’s team is using a radiotracer—a radioactive substance used in medical imaging—that binds to an important type of immune cell in the brain called microglia.

Microglia play a crucial role in maintaining brain health, and their dysfunction is associated with a range of neurodegenerative diseases such as Alzheimer’s disease, Parkinson’s disease, and multiple sclerosis.

Using positron emission tomography scans, the researchers are investigating whether there are differences between the microglia of men and women with alcohol use disorder.

They are finding that women with alcohol use disorder have a greater deficit of microglia than their male counterparts.

The findings, Cosgrove says, are not surprising. The fact that women are more likely than men to suffer from autoimmune diseases such as multiple sclerosis points to differences underlying female immune processes.

“Women have different immune systems inherently,” she says.

Because immune dysfunction and inflammation are intertwined, when part of the neuroimmune system malfunctions, it can lead to increased or chronic inflammation.

Cosgrove’s research into neuroimmune sex differences could help explain why women are prone to alcohol-related harms such as alcohol-related liver disease.

“These sex differences are likely driven by differences in inflammation,” says Cosgrove.

“And we can think of different medications and treatments to target and treat this inflammation.”

Cosgrove’s team is also working to identify other differences in the neuroimmune system that might be easier to therapeutically target.

“The target we have now, TSPO [a protein expressed by microglia], isn’t an easy target to make a medication for,” she says.

“But there are likely other targets that could give us new ideas for medication development.”

Meanwhile, Picciotto is testing hypotheses raised by studies like Cosgrove’s in mice. By feeding a blocker of microglia to female mice, for example, her team is studying how this alters their likelihood to choose alcohol over water when under stress.

So far, they have found that reducing microglia by half does not significantly impact stress-induced alcohol consumption.

In future studies, they plan to investigate whether a greater deficit of microglia will affect the mice’s behavior.

Her team has shown, however, that reducing inflammation does alter alcohol-seeking behaviors.

The researchers blocked signaling pathways known to promote inflammatory responses using a drug called apremilast. In humans, this drug treats conditions such as psoriatic arthritis and plaque psoriasis by reducing inflammation.

They found that the drug reduced the likelihood of mice choosing alcohol over water.

In future studies, Picciotto hopes to identify specific subtypes of inflammatory responses that are important in the context of alcohol use disorder.

“We’re interested in whether different circuits are engaged in response to stress-induced alcohol drinking in male and female mice,” she says.

“And we’re going to continue to look at the consequences of these inflammatory responses on neural signaling.”

Envisioning a future of personalized therapies:

This January, former U.S. Surgeon General Vivek Murthy issued a report communicating to the public for the first time that alcohol is a significant cause of cancer—further highlighting the urgent need for effective therapies for alcohol use disorder.

The ongoing research at the Yale Program on Sex Differences in Alcohol Disorder is paving the way for a new era of more personalized therapeutics for women.

“We’re just at the beginning of really understanding what it is about the brain and body that differs between men and women who drink,” says Picciotto.

McKee hopes her program will help contribute to a healthier society overall. “Our goal is to improve the health of everyone,” she says. “We really need to be focused on a personalized medicine perspective—particularly in regard to addiction and alcohol.”

Women can develop alcohol use disorder no matter what stage of life they’re in. And there is no shame in seeking support, the researchers say.

“We think about adolescents as being particularly susceptible to problematic drinking—and they certainly are—but women can be at risk of developing problematic drinking patterns across their lifespan,” says Picciotto. Research shows that alcohol use, for instance, is also on the rise among women ages 60 and older.

“Stressful life events may increase a woman’s alcohol intake in ways that are surprising to them, and there are options for them to get help if they need to decrease their drinking during stressful times.”

For resources on finding quality, evidence-based care, women can visit the National Institute on Alcohol Abuse and Alcoholism (NIAAA) Alcohol Treatment Navigator.

The Yale Program on Sex Differences in Alcohol Disorder is also enrolling for medication trials.



97
https://www.livescience.com/technology/artificial-intelligence/ai-is-just-as-overconfident-and-biased-as-humans-can-be-study-shows

AI is just as overconfident and biased as humans can be, study shows

May 4, 2025


(Image credit: SEAN GLADWELL/Getty Images)

Irrational tendencies — including the hot hand, base-rate neglect and sunk cost fallacy — commonly show up in AI systems, calling into question how useful they actually are.

Although humans and artificial intelligence (AI) systems "think" very differently, new research has revealed that AIs sometimes make decisions as irrationally as we do.

In almost half of the scenarios examined in a new study, ChatGPT exhibited many of the most common human decision-making biases. Published April 8 in the journal Manufacturing & Service Operations Management, the findings are the first to evaluate ChatGPT's behavior across 18 well-known cognitive biases found in human psychology.

The paper's authors, from five academic institutions across Canada and Australia, tested OpenAI's GPT-3.5 and GPT-4 — the two large language models (LLMs) powering ChatGPT — and discovered that despite being "impressively consistent" in their reasoning, they're far from immune to human-like flaws.

What's more, such consistency itself has both positive and negative effects, the authors said.

"Managers will benefit most by using these tools for problems that have a clear, formulaic solution," study lead-author Yang Chen, assistant professor of operations management at the Ivey Business School, said in a statement. "But if you’re using them for subjective or preference-driven decisions, tread carefully."

The study took commonly known human biases, including risk aversion, overconfidence and the endowment effect (where we assign more value to things we own) and applied them to prompts given to ChatGPT to see if it would fall into the same traps as humans.
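
To make the method concrete, here is a minimal sketch of how such a bias probe could be posed to a chatbot (query_llm is a hypothetical placeholder for whatever model client you use, and the prompts are illustrative, not the paper's actual materials):

# Sketch of probing a language model for risk preferences, in the spirit of the study.
# `query_llm` is a hypothetical placeholder; swap in your own model client.
def query_llm(prompt: str) -> str:
    raise NotImplementedError("plug in a real model client here")

prompts = {
    "gain": ("You manage inventory for a retailer. Choose one: "
             "(A) a guaranteed saving of $500, or (B) a 50% chance to save $1,000 "
             "and a 50% chance to save nothing. Answer A or B and explain briefly."),
    "loss": ("You manage inventory for a retailer. Choose one: "
             "(A) a guaranteed loss of $500, or (B) a 50% chance to lose $1,000 "
             "and a 50% chance to lose nothing. Answer A or B and explain briefly."),
}

# Both options in each frame have the same expected value, so a purely "rational"
# responder should be indifferent; a human-like one tends to take the sure thing
# for gains and the gamble for losses.
for frame, prompt in prompts.items():
    try:
        print(frame, "->", query_llm(prompt))
    except NotImplementedError as err:
        print(frame, "->", err)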

Rational decisions — sometimes:

The scientists asked the LLMs hypothetical questions taken from traditional psychology, and in the context of real-world commercial applicability, in areas like inventory management or supplier negotiations. The aim was to see not just whether AI would mimic human biases but whether it would still do so when asked questions from different business domains.

GPT-4 outperformed GPT-3.5 when answering problems with clear mathematical solutions, showing fewer mistakes in probability and logic-based scenarios. But in subjective simulations, such as whether to choose a risky option to realize a gain, the chatbot often mirrored the irrational preferences humans tend to show.

"GPT-4 shows a stronger preference for certainty than even humans do," the researchers wrote in the paper, referring to the tendency for AI to tend towards safer and more predictable outcomes when given ambiguous tasks.

More importantly, the chatbots' behaviors remained mostly stable whether the questions were framed as abstract psychological problems or operational business processes. The study concluded that the biases shown weren't just a product of memorized examples — but part of how AI reasons.

One of the surprising outcomes of the study was the way GPT-4 sometimes amplified human-like errors. "In the confirmation bias task, GPT-4 always gave biased responses," the authors wrote in the study. It also showed a more pronounced tendency for the hot-hand fallacy (the bias to expect patterns in randomness) than GPT-3.5.

Conversely, ChatGPT did manage to avoid some common human biases, including base-rate neglect (where we ignore statistical facts in favor of anecdotal or case-specific information) and the sunk-cost fallacy (where decision making is influenced by a cost that has already been sustained, allowing irrelevant information to cloud judgment).

According to the authors, ChatGPT’s human-like biases come from training data that contains the cognitive biases and heuristics humans exhibit. Those tendencies are reinforced during fine-tuning, especially when human feedback favors plausible responses over rational ones. When faced with more ambiguous tasks, the AI skews towards human reasoning patterns rather than direct logic.

"If you want accurate, unbiased decision support, use GPT in areas where you'd already trust a calculator," Chen said. When the outcome depends more on subjective or strategic inputs, however, human oversight is more important, even if it's adjusting the user prompts to correct known biases.

"AI should be treated like an employee who makes important decisions — it needs oversight and ethical guidelines," co-author Meena Andiappan, an associate professor of human resources and management at McMaster University, Canada, said in the statement.

"Otherwise, we risk automating flawed thinking instead of improving it."
98
https://www.wired.com/story/grid-scale-battery-storage-is-quietly-revolutionizing-the-energy-system/?utm_source=nl&utm_brand=wired&utm_mailing=WIR_Backchannel_050125&utm_campaign=aud-dev&utm_medium=email&utm_content=WIR_Backchannel_050125&bxid=67883001cdeb6340250c3d97&cndid=85787720&hasha=c9edd795ab58c731e64cc2832451a46d&hashb=92cd5a4e4f9a554757364e6cc6a52d8ff33f14ec&hashc=1e7f7a9239bb44f191dc979b8fe5e634e587dfe020b84a653d2040468a8b342b&esrc=bx_multi2nd_science&utm_term=WIR_Backchannel

Grid-Scale Battery Storage Is Quietly Revolutionizing the Energy System

Apr 26, 2025

This energy storage technology is harnessing the potential of solar and wind power—and its deployment is growing exponentially.

The tricky thing about generating electricity is that for the most part, you pretty much have to use it or lose it.

This fundamental fact has governed and constrained the development of the world’s largest machine: the $2 trillion US power grid. Massive generators send electrons along a continent-wide network of conductors, transformers, cables, and wires into millions of homes and businesses, delicately balancing supply and demand so that every light switch, computer, television, stove, and charging cable will turn on 99.95 percent of the time.

Making sure there are always enough generators spooled up to send electricity to every single power outlet in the country requires precise coordination. And while the amount of electricity actually used can swing drastically throughout the day and year, the grid is built to meet the brief periods of peak demand, like the hot summer days when air conditioning use can double average electricity consumption. Imagine building a 30-lane highway to make sure no driver ever has to tap their brakes. That’s effectively what those who design and run the grid have had to do.

But what if you could just hold onto electricity for a bit and save it for later? You wouldn’t have to overbuild the grid or spend so much effort keeping power generation in equilibrium with users. You could smooth over the drawbacks of intermittent power sources that don’t emit carbon dioxide, like wind and solar. You could have easy local backup power in emergencies when transmission lines are damaged. You may not even need a giant, centralized power grid at all.

That’s the promise of grid-scale energy storage. And while the US has actually been using a crude form of energy storage called pumped hydroelectric power storage for decades, the country is now experiencing a gargantuan surge in energy storage capacity, this time from a technology that most of us are carrying around in our pockets: lithium-ion batteries. Between 2021 and 2024, grid battery capacity increased fivefold. In 2024, the US installed 12.3 gigawatts of energy storage. This year, new grid battery installations are on track to almost double compared to last year. Battery storage capacity now exceeds pumped hydro capacity, totaling more than 26 gigawatts.

There’s still plenty of room to expand—and a pressing need to do so. The power sector remains the second-largest source of greenhouse gas emissions in the US, and there will be no way to add enough intermittent clean energy to sufficiently decarbonize the grid without cheap and plentiful storage.

The aging US grid is also in dire need of upgrades, and batteries can cushion the shock of adding gigawatts of wind and solar while buying some time to perform more extensive renovations. Some power markets are finally starting to understand all the services batteries can provide—frequency regulation, peak shaving, demand response—creating new lines of business. Batteries are also a key tool in building smaller, localized versions of the power grid. These microgrids can power remote communities with reliable power and one day shift the entire power grid into a more decentralized system that can better withstand disruptions like extreme weather.

If we can get it right, true grid-scale battery storage won’t just be an enabler of clean energy, but a way to upgrade the power system for a new era.

How Big Batteries Got So Big:

Back in 2011, one of my first reporting assignments was heading to a wind farm in West Virginia to attend the inauguration of what was at the time the world’s largest battery energy storage system. Built by AES Energy Storage, it involved thousands of lithium-ion cells in storage containers that together combined to provide 32 megawatts of power and deliver it for about 15 minutes.

“It was eight megawatt-hours total,” said John Zahurancik, who was vice president of AES Energy Storage at the time and showed me around the facility back then. That was about the amount of electricity used by 260 homes in a day.

In the years since, battery storage has increased by orders of magnitude, as Zahurancik’s new job demonstrates. He is now the president of Fluence, a joint venture between AES and Siemens that has deployed 38 gigawatt-hours of storage to date around the world. “The things that we’re building today, many of our projects are over a gigawatt-hour in size,” Zahurancik said.

Last year, the largest storage facility to come online in the US was California’s Edwards & Sanborn Project, which can hold 3.3 gigawatt-hours. That’s roughly equivalent to the electricity needed to power 110,000 homes for a day.
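
Those home-equivalence figures check out against a typical US household's use of roughly 30 kilowatt-hours per day (my own ballpark number for the conversion, not one given in the article):

# Converting storage capacity into "homes powered for a day",
# assuming roughly 30 kWh of daily use per US household (a ballpark figure).
KWH_PER_HOME_PER_DAY = 30

def homes_for_a_day(capacity_mwh):
    return capacity_mwh * 1000 / KWH_PER_HOME_PER_DAY

print(round(homes_for_a_day(8)))     # the 2011 AES system, 8 MWh -> ~267 homes
print(round(homes_for_a_day(3300)))  # Edwards & Sanborn, 3.3 GWh -> ~110,000 homes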

It wasn’t a steady climb to this point, however. Overall grid battery capacity in the US barely budged for more than a decade. Then, around 2020, it began to spike upward. What changed?

One shift is that the most common battery storage technology, lithium-ion cells, saw huge price drops and energy density increases. “The very first project we did was in 2008 and it was on the order of $3,000 a kilowatt-hour for the price of the batteries,” said Zahurancik. “Now we’re looking at systems that are on the order of $150, $200 a kilowatt-hour for the full system install.”

That’s partly because the cells on the power grid aren’t that different from those in mobile devices and electric vehicles, so grid batteries have benefited from manufacturing improvements that went into those products.

“It’s all one big pipeline,” said Micah Ziegler, a professor at Georgia Tech who studies clean energy technologies. “The batteries in phones, cars, and the grid all share common characteristics.” Seeing this rising demand, China went big on battery manufacturing and, much as it did in solar panels, created economies of scale to drive global prices down. China now produces 80 percent of the world’s lithium-ion batteries.

The blooming of wind and solar energy created even more demand for batteries and increased the pressure to improve them. The wind and the sun are often the cheapest sources of new electricity, and batteries help compensate for their variability, providing even more reason to scale up storage. “The benefits of this relationship are apparent in the increasing number of power plants that are being proposed and that have already been deployed that combine these resources,” Ziegler said. The combination of solar plus storage accounted for 84 percent of new US power added in 2024.

And because grid batteries don’t have to be small enough to be mobile—unlike the batteries in your laptop or phone—they can take advantage of cheaper, less dense batteries that otherwise might not be suited for something that has to fit in your pocket. There’s even talk of giving old EV batteries a second life on the power grid.

Regulation has also helped. A major hurdle for deploying grid energy storage systems is that they don’t generate electricity on their own, so the rules for how they should connect to the grid and how much battery developers should get paid for their services were messy and restrictive in the past. The Federal Energy Regulatory Commission’s Order 841 removed some of the barriers for energy storage systems to plug into wholesale markets and compete with other forms of power. Though the regulation was issued in 2018, it cleared a major legal challenge in 2020, paving the way for more batteries to plug into the grid.

Eleven states to date, including California, Illinois, and Maryland, have also set specific procurement targets for energy storage, which require utilities to install a certain amount of storage capacity, creating a push for more grid batteries. Together, these factors created whole new businesses for power companies, spawned new grid battery companies, and fertilized the ground for a bumper crop of energy storage.

What Can Energy Storage Do for You?

Energy storage is the peanut butter to the chocolate of renewable energy, making all the best traits about clean energy even better and balancing out some of its downsides. But it’s also an important ingredient in grid stability, reliability, and resilience, helping ensure a steady flow of megawatts during blackouts and extreme weather.

The most common use is frequency response. The alternating current going through power lines in the US cycles at a frequency of 60 hertz. If the grid dips below this frequency when a power-hungry user switches on, it can trip circuit breakers and cause power instability. Since batteries have nearly zero startup time, unlike thermal generators, they can quickly absorb or transmit power as needed to keep the grid humming the right tune.
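
One simple way to picture that frequency-response role is a proportional "droop" controller (a toy sketch, not any particular operator's or vendor's control scheme): when frequency sags below 60 hertz the battery discharges, and when it rises above 60 hertz it absorbs power, in proportion to the deviation and limited by its rated output.

# Toy proportional ("droop") frequency response for a grid battery.
# Positive output = discharge into the grid, negative = absorb (charge) from the grid.
NOMINAL_HZ = 60.0

def battery_response_mw(frequency_hz, capacity_mw=100.0,
                        gain_mw_per_hz=500.0, deadband_hz=0.02):
    error = NOMINAL_HZ - frequency_hz
    if abs(error) <= deadband_hz:   # ignore tiny, normal wobbles
        return 0.0
    power = gain_mw_per_hz * error  # respond in proportion to the deviation
    return max(-capacity_mw, min(capacity_mw, power))  # capped at the battery's rating

for f in (60.00, 59.95, 59.80, 60.10):
    print(f"{f:.2f} Hz -> {battery_response_mw(f):+.0f} MW")
# 60.00 Hz -> +0 MW, 59.95 Hz -> +25 MW, 59.80 Hz -> +100 MW, 60.10 Hz -> -50 MW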

Grid batteries can also step in as reserve power when a generator goes offline or when a large power user unexpectedly turns on. They can smooth out the hills and valleys of power load over the course of the day. They also let power providers save electricity when it’s cheap to produce, and sell it back on the grid at times when demand is high and power is expensive. It’s often faster to build a battery facility than an equivalent power plant, and since there are no smokestacks, it’s easier to get permits and approvals.

Batteries have already proven useful for overstressed power networks. As temperatures reached triple digits in Texas last year, batteries provided a record amount of power on the Lone Star State’s grid. ERCOT, the Texas grid operator, didn’t have to ask Texans to turn down their power use like it did in 2023. Between 2020 and 2024, Texas saw a 4,100 percent increase in utility-scale batteries, topping 5.7 gigawatts.



Jupiter Power battery storage complex in Houston in 2024. Photograph: Jason Fochtman/Houston Chronicle via Getty Images

Grid batteries have a halo effect for other power generators too. Most thermal power plants—coal, gas, nuclear—prefer to run at a steady pace. Ramping up and down to match demand takes time and costs money, but with batteries soaking up some of the variability, thermal power plants can stay closer to their most efficient pace, reducing greenhouse gas emissions and keeping costs in check.

“It’s kind of like hybridizing your car,” Zahurancik said. “If you think about a Prius, you have an electric motor and you have a gasoline motor and you make the gas consumption better because the battery absorbs all the variation.”

Another grid battery feature is that they can reduce the need for expensive grid upgrades, said Stephanie Smith, chief operating officer at Eolian, which funds and develops grid energy storage systems. You don’t have to build power lines to accommodate absolute maximum electricity needs if you have a battery—on the generator side or on the demand side—to dish out a few more electrons when needed.

“What we do with stand-alone batteries, the more and more of those you get, you start to alleviate needs or at least abridge things like new transmission build,” Smith said. These batteries also allow the grid to adapt faster to changing energy needs, like when a factory shuts down or when a new data center powers up.

On balance this leads to a more stable, efficient, cheaper, and cleaner power grid.

Charging Up:

As good as they are, lithium-ion batteries have their limits. Most grid batteries are designed to store and dispatch electricity over the course of two to eight hours, but the grid also needs ways to stash power for days, weeks, and even months since power demand shifts throughout the year.

There are also some fundamental looming challenges for grid-scale storage. Like most grid-level technologies, energy storage requires a big upfront investment that takes decades to pay back, but there’s a lot of uncertainty right now about how the Trump administration’s tariffs will affect battery imports, whether there will be a recession, and if this disruption will slow electricity demand growth in the years to come. The extraordinary appetite for batteries is increasing competition for the required raw materials, which may increase their prices.

Though China currently dominates the global battery supply chain, the US is working to edge its way in. Under the previous administration, the US Department of Energy invested billions in energy storage factories, supply chains, and research. There are dozens of battery factories in the US now, though most are aimed at electric vehicles. There are 10 US factories slated to start up this year, which would raise the total EV battery manufacturing capacity to 421.5 gigawatt-hours per year. Total global battery manufacturing is projected to reach around 7,900 gigawatt-hours in 2025.



Lithium battery modules inside the battery building at the Vistra Corp. Moss Landing Energy Storage Facility in Moss Landing, California, in 2021. Photograph: David Paul Morris/Bloomberg via Getty Images

There’s also a long and growing line of projects waiting to connect to the power grid. Interconnection queues for all energy systems, but particularly solar, wind, and batteries, typically last three years or more as project developers produce reliability studies and cope with mounting regulatory paperwork delays.

The Trump administration is also working to undo incentives around clean energy, particularly the 2022 Inflation Reduction Act. The law established robust incentives for clean energy, including tax credits for stand-alone grid energy projects. “I do worry about the IRA because it will change the curve, and quite honestly we cannot afford to change the curve right now with any form of clean energy,” Smith said. On the other hand, Trump’s tariffs may eventually spur even more battery manufacturing within the US.

Still, utility-scale energy storage is a tiny slice of the sprawling US power grid, and there’s enormous room to expand. “Even though we’ve been accelerating and going fast, by and large, we don’t have that much of it,” Zahurancik said. “You could easily see storage becoming 20 or 30 percent of the installed power capacity.”




Power transmission towers outside the Crimson Battery Energy Storage Project in Blythe, California. Photograph: Bing Guan/Bloomberg via Getty Images

The Los Angeles Department of Water and Power’s biggest solar and battery storage plant, the Eland Solar and Storage Center in the Mojave Desert. Photograph: Brian van der Brug/Los Angeles Times via Getty Images

Battery solar energy storage units, right, at the Eland Solar and Storage Center in 2024. Photograph: Brian van der Brug/Los Angeles Times via Getty Images
100
-> https://github.com/aegersz/Legal-buzz/blob/main

If you don't have a GitHub account then make one and follow me -- there is a Private section, and if you want access then I'll make you a collaborator
Pages: 1 ... 7 8 9 10
