Saturday, September 28, 2024

hydrogel-based brain sensors for enhanced adhesion

Innovative Brain Sensor Enhances Transcranial Focused Ultrasound for Neurological Disorders

Introduction to Transcranial Focused Ultrasound

Our brain sensor adheres strongly to the surface of brain tissue

Transcranial focused ultrasound is a non-invasive technique that employs high-frequency sound waves to stimulate targeted brain regions, offering a potential breakthrough in treating neurological disorders such as drug-resistant epilepsy and recurrent tremors.

Development of the Innovative Sensor

Researchers at Sungkyunkwan University (SKKU), IBS, and the Korea Institute of Science and Technology have designed an innovative sensor for transcranial focused ultrasound. Their study, featured in Nature Electronics, describes a flexible sensor that conforms to cortical surfaces, facilitating neural signal detection and low-intensity ultrasound-based brain stimulation.

Challenges with Previous Brain Sensors

"Previous efforts to develop brain sensors struggled to achieve precise signal measurement because they couldn't fully adapt to the brain's complex folds," remarked Donghee Son, the supervising author of the study, in an interview with Tech Xplore.

"The inability to precisely analyze the entire brain surface limited accurate diagnosis of brain lesions. Despite the innovative ultra-thin brain sensor developed by Professors John A. Rogers and Dae-Hyeong Kim, it encountered difficulties in tightly adhering to areas with severe curvature."

Limitations of Existing Sensors

The brain sensor created by Professors Rogers and Kim demonstrated improved precision in collecting surface-level measurements. However, it exhibited notable limitations, including difficulty adhering to areas with significant curvature and a tendency to shift from its attachment point due to micro-movements and cerebrospinal fluid flow.

These limitations reduce the sensor's suitability for clinical use, as they hinder its ability to capture brain signals reliably from specific areas over longer durations.

The New Sensor Design

To overcome these challenges, Son and colleagues developed a new sensor designed for better adhesion to curved brain surfaces, ensuring stable, long-term data collection.

"The sensor we engineered is capable of conforming to even the most curved brain regions, ensuring a firm attachment to brain tissue," said Son. "This strong bond allows for long-term, precise measurement of brain signals from specific areas."

ECoG Sensor Features

The ECoG sensor designed by Son and his team attaches firmly to brain tissue, ensuring no voids are created. This feature markedly decreases noise from external mechanical movements.

"This feature plays a crucial role in improving the efficacy of epilepsy treatment using low-intensity focused ultrasound (LIFU)," noted Son. "Although ultrasound is recognized for its ability to reduce epileptic activity, the variability in patient conditions and individual differences present significant obstacles in customizing treatments."

Personalized Ultrasound Stimulation Therapies

Recently, numerous research teams have been focused on developing personalized ultrasound stimulation therapies for epilepsy and various neurological disorders. To tailor these treatments to the specific needs of each patient, it is essential to measure their brain waves in real-time while simultaneously stimulating targeted brain regions.

Our brain sensor (SMCA) begins to form a strong bond

"Traditional sensors attached to the brain surface faced challenges in this regard, as the vibrations induced by ultrasound generated considerable noise, hindering real-time monitoring of brain waves," stated Son.

"This limitation significantly hindered the development of personalized treatment strategies. Our sensor substantially minimizes noise, facilitating effective epilepsy treatment through tailored ultrasound stimulation."

Structure of the Shape-Morphing Sensor

Son and his colleagues developed a shape-morphing brain sensor with three primary layers. These consist of a hydrogel-based layer for both physical and chemical bonding with tissue, a self-healing polymer layer that adjusts its form to fit the surface beneath, and a thin, stretchable layer containing gold electrodes and interconnects.

Son noted that when the sensor is positioned on the brain surface, the hydrogel layer activates a gelation process that establishes a strong and instant bond with the brain tissue.

"Subsequently, the self-healing polymer substrate starts to deform, adapting to the curvature of the brain, which enhances the contact area between the sensor and the tissue over time. Once the sensor has completely conformed to the brain's contours, it is primed for operation."

Advantages of the New Sensor

The sensor created by this research team offers multiple advantages compared to other brain sensors developed in recent years. Notably, it can securely attach to brain tissue while adapting its shape to conform tightly to surfaces, regardless of their curvature.

By conforming to the contours of curved surfaces, the sensor effectively reduces vibrations generated by external ultrasound stimulation. This capability enables physicians to accurately measure brain wave activity in patients, both in standard conditions and during ultrasound procedures.

Future Applications

According to Son, this technology could be applicable not only to epilepsy management but also to the diagnosis and treatment of multiple brain disorders. The most crucial aspect of the research, he says, is the synergy between tissue-adhesive technology, which enables robust adhesion to brain tissue, and shape-morphing technology, which allows the sensor to conform precisely to the brain's surface without leaving any gaps.

Testing and Future Development

To date, the novel sensor engineered by Son and his team has undergone testing on conscious, living rodents. The results obtained were exceptionally promising, demonstrating the team's ability to accurately measure brain waves and manage seizures in these animals.

The researchers aim to expand the sensor's capabilities by developing a high-density array based on their initial design. Upon successful completion of clinical trials, this enhanced sensor could be utilized to diagnose and treat epilepsy and other neurological disorders, potentially advancing the effectiveness of prosthetic technologies.

Son highlighted high-resolution mapping of brain signals as an area ripe for improvement: the sensor currently integrates 16 electrode channels.

"Taking this into consideration, our strategy involves significantly augmenting the number of electrodes to enable comprehensive and high-resolution brain signal analysis. We also aspire to devise a minimally invasive implantation technique for the brain sensor on the surface of the brain, aiming for its application in clinical research."

Source


Tuesday, July 23, 2024

AI technology for wildfire prevention

USC researchers have introduced an innovative method for predicting wildfire spread with high accuracy. Their model integrates satellite imagery and AI, potentially revolutionizing wildfire management and emergency response protocols.

As detailed in an initial study published in Artificial Intelligence for the Earth Systems, the USC model utilizes satellite imagery to track the real-time progression of wildfires and processes this information through a sophisticated algorithm to accurately forecast the fire's potential path, intensity, and expansion rate.

This research emerges as California and the broader western United States contend with an intensifying wildfire season. Numerous fires, driven by a perilous mix of wind, drought, and extreme heat, are currently blazing across the region. Notably, the Lake Fire, the largest wildfire in California this year, has already consumed over 38,000 acres in Santa Barbara County.

According to Bryan Shaddy, a doctoral candidate in the Department of Aerospace and Mechanical Engineering at the USC Viterbi School of Engineering and the study's lead author, "This model marks a significant advancement in our capability to combat wildfires. By providing more accurate and timely data, our tool enhances the efforts of firefighters and evacuation teams on the front lines."

AI-Powered Approach to Understanding Wildfire Behavior

The research team initiated their study by collecting historical wildfire data from high-resolution satellite imagery. Through meticulous examination of past wildfire behavior, they traced the ignition, spread, and containment of each fire. Their thorough analysis uncovered patterns shaped by various factors such as weather conditions, fuel types (e.g., trees, brush), and terrain characteristics.

Subsequently, the researchers trained a generative AI model, specifically a conditional Wasserstein Generative Adversarial Network (cWGAN), to simulate the impact of various factors on wildfire progression over time. This model was programmed to identify patterns in satellite imagery that correspond with the wildfire spread observed in their simulations.
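The "Wasserstein" in cWGAN refers to the distance the network minimizes between generated and observed data distributions. As a minimal illustration (not the USC code), in one dimension the empirical Wasserstein-1 distance between two equal-size samples reduces to the mean gap between sorted samples, which is the quantity a WGAN critic learns to approximate:

```python
def wasserstein1(sample_a, sample_b):
    """Empirical 1-D Wasserstein-1 distance between two equal-size samples.

    After sorting, it is the mean absolute gap between matched order
    statistics -- the quantity a WGAN critic is trained to estimate.
    """
    assert len(sample_a) == len(sample_b)
    a, b = sorted(sample_a), sorted(sample_b)
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

# Identical samples have zero distance; a uniform shift of s gives distance s.
print(wasserstein1([1, 5, 9], [1, 5, 9]))   # 0.0
print(wasserstein1([1, 5, 9], [3, 7, 11]))  # 2.0
```

In the cWGAN setting this comparison happens in a high-dimensional space of fire-arrival maps, conditioned on satellite observations, but the underlying objective is the same distributional distance.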

They then validated the cWGAN model by applying it to actual wildfire events that transpired in California from 2020 to 2022, assessing its predictive accuracy regarding fire spread.

"Analyzing historical fire behavior allows us to develop a predictive model for forecasting the spread of future wildfires," stated Assad Oberai, Hughes Professor and Professor of Aerospace and Mechanical Engineering at USC Viterbi, and co-author of the study.

AI-Powered Wildfire Forecasting: An Exemplary Predictive Model

Oberai and Shaddy expressed their satisfaction with the cWGAN model, which, despite being initially trained on basic simulated data with ideal conditions such as flat terrain and uniform wind, demonstrated strong performance when tested on actual California wildfires. They credit this success to the integration of real wildfire data from satellite imagery, rather than relying solely on simulated scenarios.

Oberai, known for his expertise in developing computer models to elucidate complex physical phenomena, has tackled a diverse range of subjects including turbulent airflow around aircraft wings, infectious disease dynamics, and cellular interactions in tumors. Among these, he highlights wildfires as one of the most intricate challenges he has faced.

Wildfires encompass complex processes in which the ignition of fuels such as grass, shrubs, or trees initiates intricate chemical reactions that produce heat and wind currents. Additionally, topography and weather conditions significantly affect fire dynamics: fires tend to spread slowly in moist environments but can accelerate rapidly under dry conditions. These phenomena are characterized by their complexity, chaos, and nonlinearity, requiring sophisticated computational models to accurately capture all influencing factors.
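A toy probabilistic cellular automaton (purely illustrative, and far simpler than the physics-based models the study couples to satellite data) shows how a single spread probability, standing in for fuel moisture and wind, produces the nonlinear spread behavior described above:

```python
import random

def spread(grid, p_burn):
    """One step of a toy fire-spread cellular automaton.

    Cell states: 0 = unburned fuel, 1 = burning, 2 = burned out.
    p_burn stands in for moisture, wind, and fuel type.
    """
    rows, cols = len(grid), len(grid[0])
    nxt = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1:
                nxt[r][c] = 2  # burning cells burn out this step
                for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols and grid[rr][cc] == 0:
                        if random.random() < p_burn:
                            nxt[rr][cc] = 1  # fire jumps to neighbouring fuel
    return nxt

random.seed(42)
g = [[0] * 5 for _ in range(5)]
g[2][2] = 1  # ignition in the centre
for _ in range(4):
    g = spread(g, 0.6)  # dry conditions: high spread probability
print(sum(cell == 2 for row in g for cell in row), "cells burned out")
```

Lowering `p_burn` (a wetter landscape) can stop the fire after a few cells, while raising it lets the burn race across the grid, a crude analogue of the threshold-like, chaotic sensitivity real models must capture.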

The research team also includes Valentina Calaza, an undergraduate student in the Department of Aerospace and Mechanical Engineering at USC Viterbi; Deep Ray from the University of Maryland, College Park, who was previously a postdoctoral researcher at USC Viterbi; Angel Farguell and Adam Kochanski from San Jose State University; Jan Mandel of the University of Colorado, Denver; James Haley and Kyle Hilburn from Colorado State University, Fort Collins; and Derek Mallia from the University of Utah.

Further Details: Shaddy, B., et al. (2024). Generative Algorithms for Integrating Physics-Based Wildfire Spread Models with Satellite Data to Enhance Wildfire Forecasting. Artificial Intelligence for the Earth Systems. DOI: 10.1175/AIES-D-23-0087.1

Source 


Tuesday, July 16, 2024

Benefits of neuromorphic chips for AI training efficiency

Neural Network Training Simplified by Intelligent Hardware Solutions

Overview of Neuromorphic Chip Technology

AI and Neuromorphic Chips

Large-scale neural network models underpin numerous AI technologies, including neuromorphic chips inspired by the human brain. Training these networks can be arduous, time-consuming, and energy-inefficient, as the model is typically trained on a computer before being transferred to the chip. This process restricts the application and efficiency of neuromorphic chips.

On-Chip Training Innovation

TU/e researchers have developed a neuromorphic device that facilitates on-chip training, removing the requirement to transfer trained models to the chip. This breakthrough could lead to the creation of efficient and dedicated AI chips.

Inspiration and Teamwork Behind the Technology

Mimicking the Human Brain

Consider the marvel that is the human brain: a remarkable computational system known for its speed, dynamism, adaptability, and exceptional energy efficiency.

Researchers at TU/e, led by Yoeri Van de Burgt, have been inspired by these attributes to replicate neural processes in technologies critical for learning, including AI applications in transport, communication, and healthcare.

Neural Networks and Challenges

According to Van de Burgt, an associate professor in the Department of Mechanical Engineering at TU/e, a neural network is typically central to these AI systems.

Neural networks are computational models inspired by the brain. In the human brain, neurons communicate through synapses, and the more frequently two neurons interact, the stronger their connection becomes. Similarly, in neural network models composed of nodes, the strength of the connection between any two nodes is quantified by a value known as the weight.
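To make the weight analogy concrete, here is a minimal sketch (hypothetical values, not from the study) of how a node combines its inputs: each input is scaled by its connection weight, summed, and passed through a simple nonlinearity:

```python
def node_output(inputs, weights, bias=0.0):
    """A node's activation: weighted sum of inputs passed through ReLU.

    A larger weight means a stronger connection, analogous to a
    stronger synapse between two frequently interacting neurons.
    """
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return max(0.0, total)  # ReLU: negative sums produce no output

# One strong excitatory and one weak inhibitory connection.
print(node_output([1.0, 0.5], [0.8, -0.2]))  # ~0.7
```

Training adjusts these weights so that the network's outputs move closer to the desired ones, just as repeated use strengthens a synapse.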

Van de Burgt explains that while neural networks can tackle intricate problems with substantial data, their expansion leads to increased energy demands and hardware limitations. Nevertheless, a promising hardware alternative, the neuromorphic chip, offers a solution.

The Neuromorphic Issue

Neuromorphic chips, much like neural networks, are based on brain functions, but they achieve a higher degree of imitation. In the brain, neurons fire and send electrical charges to other neurons when their electrical charge changes. This process is mirrored by neuromorphic chips.
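The fire-on-threshold behavior can be sketched with a leaky integrate-and-fire neuron, the classic software analogue of what such hardware emulates (a toy model, not the chip's actual circuit): charge accumulates, leaks a little each step, and a spike is emitted when a threshold is crossed.

```python
def lif_neuron(currents, threshold=1.0, leak=0.9):
    """Toy leaky integrate-and-fire neuron.

    Charge integrates over time with a leak factor; crossing the
    threshold emits a spike (1) and resets the membrane charge.
    """
    v, spikes = 0.0, []
    for i in currents:
        v = leak * v + i          # integrate input, with leakage
        if v >= threshold:
            spikes.append(1)      # fire
            v = 0.0               # reset after spiking
        else:
            spikes.append(0)
    return spikes

# Three moderate inputs accumulate until the neuron fires once.
print(lif_neuron([0.5, 0.5, 0.5, 0.1, 0.9]))  # [0, 0, 1, 0, 0]
```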

"In neuromorphic chips, there are memristors, or memory resistors, that can retain information about the electrical charge that has flowed through them," Van de Burgt states. This functionality is crucial for devices designed to emulate the information storage and communication of brain neurons.

However, a neuromorphic challenge exists, and it pertains to the two methods used for training hardware based on neuromorphic chips. The first method involves conducting the training on a computer, with the resultant network weights then being transferred to the chip hardware. The alternative approach is to conduct the training in situ, directly within the hardware, but current devices must be individually programmed and subsequently error-checked. This necessity arises because most memristors exhibit stochastic behavior, making it impossible to update the device without verification.

Both strategies are costly in terms of time, energy, and computing resources, Van de Burgt indicates. To realize the full energy-efficiency potential of neuromorphic chips, training should be conducted directly on them.

The TU/e Solution

This is precisely what Van de Burgt and his collaborators at TU/e have accomplished, as detailed in their recent publication in Science Advances. "This was a true team effort, spearheaded by co-first authors Tim Stevens and Eveline van Doremaele," Van de Burgt notes.

This research began during Tim Stevens' master's studies. "I became fascinated by this subject during my master's research. Our work shows that training can be executed directly on hardware, without needing to transfer a trained model to the chip. This advancement could lead to more efficient AI chips," says Stevens.

Van de Burgt, Stevens, and Van Doremaele--who completed her Ph.D. thesis on neuromorphic chips in 2023--required assistance with the hardware design. Consequently, they sought the expertise of Marco Fattore from the Department of Electrical Engineering.

"My team assisted with the circuit design elements of the chip," Fattori states. "It was fantastic to collaborate on this interdisciplinary project, bridging the gap between chip developers and software engineers."

Van de Burgt observed from this project that valuable ideas can arise from any academic stage. "Tim identified the broader potential of our device properties during his master's studies. There's an important lesson here for all projects."

Dual-Layer Training

The researchers faced a central challenge: integrating the critical components necessary for on-chip training onto a single neuromorphic chip. "One of the major tasks was integrating components such as electrochemical random-access memory (EC-RAM)," explains Van de Burgt. "These components replicate the electrical charge storage and firing characteristics of neurons."

The researchers developed a two-layer neural network utilizing EC-RAM components fabricated from organic materials. They evaluated the hardware using a variant of the widely adopted training algorithm, backpropagation with gradient descent. "The conventional algorithm is commonly used to refine neural network accuracy, but it's not suited for our hardware, leading us to create our own variant," states Stevens.
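For readers unfamiliar with the baseline algorithm, here is a minimal software sketch (illustrative only; not the team's hardware variant) of a two-layer network trained on XOR with conventional backpropagation and gradient descent:

```python
import math
import random

def train_xor(epochs=4000, lr=0.5, seed=1):
    """Two-layer network (2-2-1, sigmoid units) trained on XOR with
    plain backpropagation and gradient descent.
    Returns the per-epoch mean squared losses."""
    rng = random.Random(seed)
    w1 = [[rng.uniform(-1, 1) for _ in range(2)] for _ in range(2)]  # input->hidden
    b1 = [0.0, 0.0]
    w2 = [rng.uniform(-1, 1) for _ in range(2)]                      # hidden->output
    b2 = 0.0
    sig = lambda z: 1.0 / (1.0 + math.exp(-z))
    data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
    losses = []
    for _ in range(epochs):
        total = 0.0
        for x, t in data:
            # forward pass
            h = [sig(w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j]) for j in range(2)]
            y = sig(w2[0] * h[0] + w2[1] * h[1] + b2)
            total += (y - t) ** 2
            # backward pass: output delta, then hidden deltas
            dy = (y - t) * y * (1 - y)
            dh = [dy * w2[j] * h[j] * (1 - h[j]) for j in range(2)]
            # gradient-descent weight updates
            for j in range(2):
                w2[j] -= lr * dy * h[j]
                b1[j] -= lr * dh[j]
                for i in range(2):
                    w1[j][i] -= lr * dh[j] * x[i]
            b2 -= lr * dy
        losses.append(total / len(data))
    return losses

losses = train_xor()
print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

In software every weight update is an exact arithmetic step; the difficulty the TU/e team addresses is performing equivalent updates in analog EC-RAM devices, whose physical behavior does not match this idealized procedure.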

In addition, as AI applications in many domains become increasingly unsustainable in their energy consumption, the prospect of training neural networks on hardware with minimal energy expenditure becomes an attractive option across various applications--from ChatGPT to weather forecasting.

The Subsequent Phase

While the researchers have shown the efficacy of the new training approach, the next logical move is to scale up, innovate further, and enhance.

"We have demonstrated the feasibility with a small two-layer network," states Van de Burgt. "Our next objective is to collaborate with industry and major research institutions to scale up to larger networks of hardware devices and validate them with real-world data challenges."

This progression would enable the researchers to showcase the high efficiency of these systems in both training and operationalizing beneficial neural networks and AI systems. "We aim to deploy this technology across various practical scenarios," remarks Van de Burgt. "My vision is for such technologies to set a standard in future AI applications."

More information - The article 'Hardware implementation of backpropagation using progressive gradient descent for in situ training of multilayer neural network' by Eveline R. W. van Doremaele et al. appears in Science Advances (2024). DOI: 10.1126/sciadv.ado8999

