Memory 12 Larry Minikes

Model predicts cognitive decline due to Alzheimer's, up to two years out

August 2, 2019

Science Daily/Massachusetts Institute of Technology

A new model developed at MIT can help predict if patients at risk for Alzheimer's disease will experience clinically significant cognitive decline due to the disease, by predicting their cognition test scores up to two years in the future.

 

The model could be used to improve the selection of candidate drugs and participant cohorts for clinical trials, which have been notoriously unsuccessful thus far. It would also let patients know they may experience rapid cognitive decline in the coming months and years, so they and their loved ones can prepare.

 

Pharmaceutical firms over the past two decades have injected hundreds of billions of dollars into Alzheimer's research. Yet the field has been plagued with failure: Between 1998 and 2017, there were 146 unsuccessful attempts to develop drugs to treat or prevent the disease, according to a 2018 report from the Pharmaceutical Research and Manufacturers of America. In that time, only four new medicines were approved, and only to treat symptoms. More than 90 drug candidates are currently in development.

 

Studies suggest greater success in bringing drugs to market could come down to recruiting candidates who are in the disease's early stages, before symptoms are evident, which is when treatment is most effective. In a paper to be presented next week at the Machine Learning for Health Care conference, MIT Media Lab researchers describe a machine-learning model that can help clinicians zero in on that specific cohort of participants.

 

They first trained a "population" model on an entire dataset that included clinically significant cognitive test scores and other biometric data from Alzheimer's patients, and also healthy individuals, collected between biannual doctor's visits. From the data, the model learns patterns that can help predict how the patients will score on cognitive tests taken between visits. In new participants, a second model, personalized for each patient, continuously updates score predictions based on newly recorded data, such as information collected during the most recent visits.

 

Experiments indicate accurate predictions can be made looking ahead six, 12, 18, and 24 months. Clinicians could thus use the model to help select at-risk participants for clinical trials, who are likely to demonstrate rapid cognitive decline, possibly even before other clinical symptoms emerge. Treating such patients early on may help clinicians better track which antidementia medicines are and aren't working.

 

"Accurate prediction of cognitive decline from six to 24 months is critical to designing clinical trials," says Oggi Rudovic, a Media Lab researcher. "Being able to accurately predict future cognitive changes can reduce the number of visits the participant has to make, which can be expensive and time-consuming. Apart from helping develop a useful drug, the goal is to help reduce the costs of clinical trials to make them more affordable and done on larger scales."

 

Joining Rudovic on the paper are: Yuria Utsumi, an undergraduate student, and Kelly Peterson, a graduate student, both in the Department of Electrical Engineering and Computer Science; Ricardo Guerrero and Daniel Rueckert, both of Imperial College London; and Rosalind Picard, a professor of media arts and sciences and director of affective computing research in the Media Lab.

 

Population to personalization

For their work, the researchers leveraged the world's largest Alzheimer's disease clinical trial dataset, the Alzheimer's Disease Neuroimaging Initiative (ADNI). The dataset contains data from around 1,700 participants, with and without Alzheimer's, recorded during semiannual doctor's visits over 10 years.

 

The data include each participant's Alzheimer's Disease Assessment Scale-cognitive subscale (ADAS-Cog13) score, the most widely used cognitive metric in clinical trials of Alzheimer's disease drugs. The test assesses memory, language, and orientation on a scale of increasing severity up to 85 points. The dataset also includes MRI scans, demographic and genetic information, and cerebrospinal fluid measurements.

 

In all, the researchers trained and tested their model on a sub-cohort of 100 participants, each of whom made more than 10 visits, had less than 85 percent missing data, and contributed more than 600 computable features. Of those participants, 48 were diagnosed with Alzheimer's disease. But the data are sparse, with different combinations of features missing for most of the participants.
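To make the cohort selection concrete, the sketch below shows one way such filtering could be done with pandas. The DataFrame layout and column names ("participant_id", "visit_date") are assumptions for illustration, not the real ADNI schema or the authors' code.

# Hypothetical sketch of the cohort filtering described above; column names
# are assumptions, not the ADNI schema.
import pandas as pd

def select_subcohort(adni: pd.DataFrame,
                     min_visits: int = 11,
                     max_missing_fraction: float = 0.85) -> pd.DataFrame:
    """Keep participants with more than 10 visits and under 85% missing data."""
    kept_ids = []
    for pid, visits in adni.groupby("participant_id"):
        feature_cols = visits.columns.drop(["participant_id", "visit_date"])
        missing_fraction = visits[feature_cols].isna().to_numpy().mean()
        if len(visits) >= min_visits and missing_fraction < max_missing_fraction:
            kept_ids.append(pid)
    return adni[adni["participant_id"].isin(kept_ids)]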

 

To tackle that, the researchers used the data to train a population model powered by a "nonparametric" probability framework, called Gaussian Processes (GPs), which has flexible parameters to fit various probability distributions and to process uncertainties in data. This technique measures similarities between variables, such as patient data points, to predict a value for an unseen data point -- such as a cognitive score. The output also contains an estimate for how certain it is about the prediction. The model works robustly even when analyzing datasets with missing values or lots of noise from different data-collecting formats.
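As a rough illustration of a Gaussian Process regressor that returns both a score prediction and an uncertainty estimate, the sketch below uses scikit-learn on synthetic stand-in data; the features, kernel, and scales are assumptions, not the published population model.

# Illustrative only: a Gaussian Process regressor that returns both a predicted
# cognitive score and an uncertainty estimate. Features, kernel and synthetic
# data are stand-ins, not the paper's population model.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 5))                                       # stand-in biometric features
y_train = X_train @ rng.normal(size=5) + rng.normal(scale=0.5, size=200)  # stand-in cognitive scores

kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.5)   # noise term absorbs messy clinical data
population_gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
population_gp.fit(X_train, y_train)

X_new = rng.normal(size=(1, 5))                                 # an unseen participant visit
score_mean, score_std = population_gp.predict(X_new, return_std=True)
print(f"predicted score {score_mean[0]:.1f} +/- {score_std[0]:.1f}")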

 

But when evaluating the model on new patients from a held-out portion of participants, the researchers found its predictions weren't as accurate as they could be. So they personalized the population model for each new patient. The system would then progressively fill in data gaps with each new patient visit and update the ADAS-Cog13 score prediction accordingly, by continuously updating the previously unknown distributions of the GPs. After about four visits, the personalized models significantly reduced the prediction error. They also outperformed various traditional machine-learning approaches used for clinical data.
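In code, the personalization step might look roughly like the loop below, which starts from the population model's configuration and refits it on one participant's accumulating visits. This is only an approximation of the idea (the paper describes continuously updating the GP distributions rather than retraining from scratch), and all names are invented.

# Simplified sketch of per-patient personalization: reuse the population
# model's configuration and refit it on a single participant's accumulating
# visits. An approximation of the approach described above, not the paper's code.
from sklearn.base import clone

def personalize(population_gp, patient_visits, patient_scores):
    """Return one refit model per successive visit of a single patient."""
    models = []
    for n in range(1, len(patient_scores) + 1):
        personal_gp = clone(population_gp)          # same kernel and settings as the population model
        personal_gp.fit(patient_visits[:n], patient_scores[:n])
        models.append(personal_gp)                  # prediction error typically drops after a few visits
    return models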

 

Learning how to learn

But the researchers found the personalized models' results were still suboptimal. To fix that, they invented a novel "metalearning" scheme that learns to automatically choose which type of model, population or personalized, works best for any given participant at any given time, depending on the data being analyzed. Metalearning has been used before for computer vision and machine translation tasks to learn new skills or adapt to new environments rapidly with a few training examples. But this is the first time it's been applied to tracking cognitive decline of Alzheimer's patients, where limited data is a main challenge, Rudovic says.

 

The scheme essentially simulates how the different models perform on a given task -- such as predicting an ADAS-Cog13 score -- and learns the best fit. During each visit of a new patient, the scheme assigns the appropriate model, based on the previous data. With patients with noisy, sparse data during early visits, for instance, population models make more accurate predictions. When patients start with more data or collect more through subsequent visits, however, personalized models perform better.
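One way to picture such a selector, purely as an illustration: keep a table of validation errors for the population and personalized models at each visit count, and deploy whichever has been more accurate so far. The decision rule, names, and data structures below are assumptions, not the paper's metalearning algorithm.

# Illustrative stand-in for the model selector: deploy whichever model has had
# lower validation error for patients with the same number of prior visits.
# Rule, names and data structures are assumptions.
import numpy as np

def choose_and_predict(population_gp, personal_gp, x_visit, n_prior_visits, error_table):
    """error_table[model_name][n_prior_visits] holds a mean validation error."""
    pop_err = error_table["population"].get(n_prior_visits, np.inf)
    per_err = error_table["personalized"].get(n_prior_visits, np.inf)
    model = population_gp if pop_err <= per_err else personal_gp
    mean, std = model.predict(np.atleast_2d(x_visit), return_std=True)
    return mean[0], std[0]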

 

This helped reduce the error rate for predictions by a further 50 percent. "We couldn't find a single model or fixed combination of models that could give us the best prediction," Rudovic says. "So, we wanted to learn how to learn with this metalearning scheme. It's like a model on top of a model that acts as a selector, trained using metaknowledge to decide which model is better to deploy."

 

Next, the researchers are hoping to partner with pharmaceutical firms to implement the model into real-world Alzheimer's clinical trials. Rudovic says the model can also be generalized to predict various metrics for Alzheimer's and other diseases.

https://www.sciencedaily.com/releases/2019/08/190802110036.htm


Socially active 60-year-olds face lower dementia risk

August 2, 2019

Science Daily/University College London

Being more socially active in your 50s and 60s predicts a lower risk of developing dementia later on, finds a new UCL-led study.

 

The longitudinal study, published in PLOS Medicine, reports the most robust evidence to date that social contact earlier in life could play an important role in staving off dementia.

 

"Dementia is a major global health challenge, with one million people expected to have dementia in the UK by 2021, but we also know that one in three cases are potentially preventable," said the study's lead author, Dr Andrew Sommerlad (UCL Psychiatry).

 

"Here we've found that social contact, in middle age and late life, appears to lower the risk of dementia. This finding could feed into strategies to reduce everyone's risk of developing dementia, adding yet another reason to promote connected communities and find ways to reduce isolation and loneliness."

 

The research team used data from the Whitehall II study, tracking 10,228 participants who had been asked on six occasions between 1985 and 2013 about their frequency of social contact with friends and relatives. The same participants also completed cognitive testing from 1997 onwards, and researchers referred to the study subjects' electronic health records up until 2017 to see if they were ever diagnosed with dementia.

 

For the analysis, the research team focused on the relationships between social contact at age 50, 60 and 70, and subsequent incidence of dementia, and whether social contact was linked to cognitive decline, after accounting for other factors such as education, employment, marital status and socioeconomic status.
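An association analysis of this kind is commonly set up as a covariate-adjusted time-to-event regression. The sketch below shows what such a model could look like using the lifelines library on a hypothetical data file; the file and column names are illustrative assumptions, not the Whitehall II data or the authors' code.

# Hypothetical sketch of a covariate-adjusted time-to-event analysis like the
# one described above, using lifelines. File and column names are assumptions.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("whitehall_subset.csv")      # one row per participant (hypothetical file)

cph = CoxPHFitter()
cph.fit(
    df[["followup_years", "dementia", "social_contact_age60",
        "education", "employment_grade", "married"]],
    duration_col="followup_years",            # years from the age-60 assessment to diagnosis or censoring
    event_col="dementia",                     # 1 if dementia was diagnosed during follow-up
)
cph.print_summary()                           # hazard ratios, e.g. for social_contact_age60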

 

The researchers found that increased social contact at age 60 is associated with a significantly lower risk of developing dementia later in life. The analysis showed that someone who saw friends almost daily at age 60 was 12% less likely to develop dementia than someone who only saw one or two friends every few months.

 

They found similarly strong associations between social contact at ages 50 and 70 and subsequent dementia; while those associations did not reach statistical significance, the researchers say that social contact at any age may well have a similar impact on reducing dementia risk.

 

Social contact in mid to late life was similarly correlated with general cognitive measures.

 

Previous studies have found a link between social contact and dementia risk, but they did not have such long follow-up times, so they could not rule out the possibility that the beginnings of cognitive decline may have been causing people to see fewer people, rather than the other way around. The long follow-up in the present study strengthens the evidence that social engagement could protect people from dementia in the long run.

 

The researchers say there are a few explanations for how social contact could reduce dementia risk.

 

"People who are socially engaged are exercising cognitive skills such as memory and language, which may help them to develop cognitive reserve -- while it may not stop their brains from changing, cognitive reserve could help people cope better with the effects of age and delay any symptoms of dementia," said senior author Professor Gill Livingston (UCL Psychiatry).

 

"Spending more time with friends could also be good for mental wellbeing, and may correlate with being physically active, both of which can also reduce the risk of developing dementia," added Professor Livingston, who previously led a major international study outlining the lifecourse factors that affect dementia risk.

 

The researchers were supported by Wellcome and the National Institute for Health Research UCLH Biomedical Research Centre, while the Whitehall II study is supported by the US National Institutes of Health, UK Medical Research Council and the British Heart Foundation.

 

The study was conducted by researchers in UCL Psychiatry, UCL Epidemiology & Public Health, Camden & Islington NHS Foundation Trust and Inserm.

 

Dr Kalpa Kharicha, Head of Innovation, Policy and Research at the Campaign to End Loneliness, said: "We welcome these findings that show the benefits of frequent social contact in late/middle age on dementia risk. As we found in our Be More Us Campaign, almost half of UK adults say that their busy lives stop them from connecting with other people. It's important we make changes to our daily lives to ensure we take the time to connect with others. We need more awareness of the benefits that social wellbeing and connectedness can have to tackle social isolation, loneliness and reduce dementia risk."

 

Fiona Carragher, Chief Policy and Research Officer at Alzheimer's Society, said: "There are many factors to consider before we can confirm for definite whether social isolation is a risk factor or an early sign of the condition -- but this study is a step in the right direction. We are proud of supporting work which helps us understand the condition better -- it is only through research that we can understand true causes of dementia and how best to prevent it.

 

"As the number of people in the UK with dementia is set to rise to one million by 2021, we must do what we can to reduce our risk -- so along with reducing your alcohol intake and stopping smoking, we encourage people across the country to get out into the sunshine, and do something active with family and friends.

 

"The Government's recent emphasis on health prevention is a welcome opportunity to reduce the risk of dementia across society. We now need to see Ministers prioritise better support initiatives to help people reduce the risk of dementia, and look forward to seeing this when the results of the Green Paper on Prevention are published later in the year."

https://www.sciencedaily.com/releases/2019/08/190802144414.htm


Call it Mighty Mouse: Breakthrough leaps Alzheimer's research hurdle

Study reveals crucial mechanisms contributing to the disease

July 31, 2019

Science Daily/University of California - Irvine

University of California, Irvine researchers have made it possible to learn how key human brain cells respond to Alzheimer's, vaulting a major obstacle in the quest to understand and one day vanquish it. By developing a way for human brain immune cells known as microglia to grow and function in mice, scientists now have an unprecedented view of crucial mechanisms contributing to the disease.

 

The team, led by Mathew Blurton-Jones, associate professor of neurobiology & behavior, said the breakthrough also holds promise for investigating many other neurological conditions such as Parkinson's, traumatic brain injury, and stroke. The details of their study have just been published in the journal Neuron.

 

The scientists dedicated four years to devising the new rodent model, which is considered "chimeric." The word, stemming from the mythical Greek monster Chimera that was part goat, lion and serpent, describes an organism containing at least two different sets of DNA.

 

To create the specialized mouse, the team generated induced pluripotent stem cells, or iPSCs, using cells donated by adult patients. Once created, iPSCs can be turned into any other type of cell. In this case, the researchers coaxed the iPSCs into becoming young microglia and implanted them into genetically modified mice. Examining the rodents several months later, the scientists found that about 80 percent of the microglia in their brains were human, opening the door for an array of new research.

 

"Microglia are now seen as having a crucial role in the development and progression of Alzheimer's," said Blurton-Jones. "The functions of our cells are influenced by which genes are turned on or off. Recent research has identified over 40 different genes with links to Alzheimer's and the majority of these are switched on in microglia. However, so far we've only been able to study human microglia at the end stage of Alzheimer's in post-mortem tissues or in petri dishes."

 

In verifying the chimeric model's effectiveness for these investigations, the team checked how its human microglia reacted to amyloid plaques, protein fragments that accumulate in the brains of people with Alzheimer's. The cells indeed showed the expected response, migrating toward the amyloid plaques and surrounding them.

 

"The human microglia also showed significant genetic differences from the rodent version in their response to the plaques, demonstrating how important it is to study the human form of these cells," Blurton-Jones said.

 

"This specialized mouse will allow researchers to better mimic the human condition during different phases of Alzheimer's while performing properly-controlled experiments," said Jonathan Hasselmann, one of the two neurobiology & behavior graduate students involved in the study. Understanding the stages of the disease, which according to the Alzheimer's Association can last from two to 20 years, has been among the challenges facing researchers.

 

Neurobiology & behavior graduate student and study co-author Morgan Coburn said: "In addition to yielding vital information about Alzheimer's, this new chimeric rodent model can show us the role of these important immune cells in brain development and a wide range of neurological disorders."

https://www.sciencedaily.com/releases/2019/07/190731125448.htm


What the brains of people with excellent general knowledge look like

Some people seem to have an answer to every general knowledge question; why?

July 31, 2019

Science Daily/Ruhr-University Bochum

The brains of people with excellent general knowledge are particularly efficiently wired. This was shown by neuroscientists at Ruhr-Universität Bochum and Humboldt-Universität zu Berlin using magnetic resonance imaging. "Although we can precisely measure the general knowledge of people and this wealth of knowledge is very important for an individual's journey through life, we currently know little about the links between general knowledge and the characteristics of the brain," says Dr. Erhan Genç from the Department of Biopsychology in Bochum. The team describes the results in the European Journal of Personality on 28 July 2019.

 

Brain images and knowledge test

The researchers examined the brains of 324 men and women with a special form of magnetic resonance imaging called diffusion tensor imaging. This makes it possible to reconstruct the pathways of nerve fibres and thus gain an insight into the structural network properties of the brain. By means of mathematical algorithms, the researchers assigned an individual value to the brain of each participant, which reflected the efficiency of his or her structural fibre network.
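The "efficiency" value can be illustrated with a standard graph metric. The sketch below computes the global efficiency of a connectome-like graph with networkx; the random stand-in graph and the choice of metric are assumptions, only an approximation of the study's actual pipeline.

# Illustrative computation of a single network-efficiency value per brain,
# using networkx on a random stand-in graph rather than real tractography data.
import networkx as nx

# Stand-in for one participant's fibre network: nodes are brain regions,
# edges are reconstructed fibre tracts between them.
connectome = nx.erdos_renyi_graph(n=90, p=0.1, seed=1)

efficiency = nx.global_efficiency(connectome)   # mean inverse shortest-path length over all node pairs
print(f"structural network efficiency: {efficiency:.3f}")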

 

The participants also completed a general knowledge test called the Bochum Knowledge Test, developed in Bochum by Dr. Rüdiger Hossiep. It comprises over 300 questions from various fields of knowledge, such as art and architecture or biology and chemistry. The team led by Erhan Genç then investigated whether the efficiency of structural networking is associated with the amount of general knowledge stored.

 

The result: People with a very efficient fibre network had more general knowledge than those with less efficient structural networking.

 

Linking pieces of information

"We assume that individual units of knowledge are dispersed throughout the entire brain in the form of pieces of information," explains Erhan Genç. "Efficient networking of the brain is essential in order to put together the information stored in various areas of the brain and successfully recall knowledge content."

 

An example: To answer the question of which constants occur in Einstein's theory of relativity, you have to connect the meaning of the term "constant" with knowledge of the theory of relativity. "We assume that more efficient networking of the brain contributes to better integration of pieces of information and thus leads to better results in a general knowledge test," says the Bochum-based researcher.

https://www.sciencedaily.com/releases/2019/07/190731102127.htm


Impaired brain activity in rats with family history of alcohol abuse

Atypical prefrontal cortex function could be target of alcohol use disorder treatment

July 29, 2019

Science Daily/Society for Neuroscience

Neural activity that reflects the intention to drink alcohol is observed in the prefrontal cortex and is blunted in rats with a family history of excessive drinking, according to research published in eNeuro. This insight could lead to novel treatments for alcohol use disorders.

 

The prefrontal cortex is a brain region involved in decision-making that becomes active before a behavior is initiated, indicating intention. David Linsenbardt, Nicholas Timme, and Christopher Lapish at Indiana University Purdue University Indianapolis investigated neural activity in the prefrontal cortex to determine if it encodes the intention to consume alcohol.

 

Linsenbardt's team compared activity before and during alcohol consumption in two types of rats. One modeled a family history of alcohol abuse, while the other lacked this family history. The prefrontal cortex was active during consumption in both types of rats, but only active pre-consumption in the rats without a family history of drinking.

 

These findings suggest that the prefrontal cortex directly encodes the intention to consume alcohol but less so in those with greater risk of abusing alcohol. Restoring prefrontal cortex activity in individuals with a predisposition to over-drink could be a new approach for treating alcohol use disorders.

https://www.sciencedaily.com/releases/2019/07/190729132334.htm


Warning to those wanting to spice up their lives

July 22, 2019

Science Daily/University of South Australia

Think twice before adding that extra kick of chili sauce or chopped jalapeno to your meal. New research involving the University of South Australia shows a spicy diet could be linked to dementia.

 

A 15-year study of 4,582 Chinese adults aged over 55 found evidence of faster cognitive decline in those who consistently ate more than 50 grams of chili a day. Memory decline was even more significant if the chili lovers were slim.

 

The study, led by Dr Zumin Shi from Qatar University, showed that those who consumed in excess of 50 grams of chili a day had almost double the risk of memory decline and poor cognition.

 

"Chili consumption was found to be beneficial for body weight and blood pressure in our previous studies. However, in this study, we found adverse effects on cognition among older adults," Dr Zumin says.

 

UniSA epidemiologist Dr Ming Li, one of five researchers involved in the study, says chili intake included both fresh and dried chili peppers but not sweet capsicum or black pepper.

 

"Chili is one of the most commonly used spices in the world and particularly popular in Asia compared to European countries," Dr Li says. "In certain regions of China, such as Sichuan and Hunan, almost one in three adults consume spicy food every day."

 

Capsaicin, the active component in chili, reportedly speeds up metabolism and fat loss and inhibits vascular disorders, but this is the first longitudinal study to investigate the association between chili intake and cognitive function.

 

Those who ate a lot of chili had a lower income and body mass index (BMI) and were more physically active compared to non-consumers. Researchers say people of normal body weight may be more sensitive to chili intake than overweight people, hence the impact on memory and weight. Education levels may also play a role in cognitive decline and this link requires further research.

https://www.sciencedaily.com/releases/2019/07/190722105939.htm


Apathy: The forgotten symptom of dementia

July 17, 2019

Science Daily/University of Exeter

Apathy is the most common neuropsychiatric symptom of dementia, with a bigger impact on function than memory loss -- yet it is under-researched and often forgotten in care. A new study has found that apathy is present in nearly half of all people with dementia, with researchers finding it is often distinct from depression.

 

Although common, apathy is often ignored as it is less disruptive in settings such as care homes than symptoms like aggression. Defined by a loss of interest and emotions, it is extremely distressing for families and it is linked with more severe dementia and worse clinical symptoms.

 

Now, research led by the University of Exeter and presented at the Alzheimer's Association International Conference in LA has analysed 4,320 people with Alzheimer's disease from 20 cohort studies, to look at the prevalence of apathy over time.

 

At the start of the study, 45% presented with apathy, and 20% had persistent apathy over time. Researchers found that a proportion had apathy without depression, which suggests that the symptom might have its own unique clinical and biological profile when compared to apathy with depression and depression only.

 

Dr Miguel de Silva Vasconcelos, of the University of Exeter and King's College London, said: "Apathy is an under-researched and often ignored symptom of dementia. It can be overlooked because people with apathy seem less disruptive and less engaging, but it has a huge impact on the quality of life of people living with dementia, and their families. Where people withdraw from activities, it can accelerate cognitive decline and we know that there are higher mortality rates in people with apathy. It's now time this symptom was recognised and prioritised in research and understanding."

 

Professor Clive Ballard, of the University of Exeter Medical School, said: "Apathy is the forgotten symptom of dementia, yet it can have devastating consequences. Our research shows just how common apathy is in people with dementia, and we now need to understand it better so we can find effective new treatments. Our WHELD study to improve care home staff training through personalised care and social interaction included an exercise programme that improved apathy, so we know we can make a difference. This is a real opportunity for interventions that could significantly benefit thousands of people with dementia."

https://www.sciencedaily.com/releases/2019/07/190717105335.htm


Older adults: Daunted by a new task? Learn 3 instead

Learning multiple things simultaneously increases cognitive abilities in older adults

July 17, 2019

Science Daily/University of California - Riverside

Learning several new things at once increases cognitive abilities in older adults, according to new research. After just 1.5 months of learning multiple tasks in a new study, participants increased their cognitive abilities to levels similar to those of middle-aged adults 30 years younger. Control group members, who did not take classes, showed no change in their performance.

 

UCR psychologist Rachel Wu says one important way of staving off cognitive decline is learning new skills as a child would. That is, be a sponge: seek new skills to learn; maintain motivation as fuel; rely on encouraging mentors to guide you; thrive in an environment where the bar is set high.

 

"The natural learning experience from infancy to emerging adulthood mandates learning many real-world skills simultaneously," Wu's research team writes in a paper recently published in The Journals of Gerontology, Series B: Psychological Sciences.

 

Likewise, the group's hypothesis held, learning multiple new skills in an encouraging environment in older adulthood leads to cognitive growth. The prize: maintaining independence in old age.

 

Building on lifelong learning research, previous studies have demonstrated the cognitive gains of older people learning new skills, such as photography or acting. But these skills were learned one at a time, or sequentially.

 

For Wu's studies, the researchers asked adults 58 to 86 years old to simultaneously take three to five classes for three months -- about 15 hours per week, similar to an undergraduate course load. The classes included Spanish, learning to use an iPad, photography, drawing/painting, and music composition.

 

The participants completed cognitive assessments before, during, and after the studies to gauge working memory (such as remembering a phone number for a few minutes), cognitive control (switching between tasks), and episodic memory (such as remembering where you've parked).

 

After just 1 ½ months, participants increased their cognitive abilities to levels similar to those of middle-aged adults, 30 years younger. Control group members, who did not take classes, showed no change in their performance.

 

"The participants in the intervention bridged a 30 year difference in cognitive abilities after just 6 weeks and maintained these abilities while learning multiple new skills," said Wu, who is an assistant professor of psychology.

 

"The take-home message is that older adults can learn multiple new skills at the same time, and doing so may improve their cognitive functioning," Wu said. "The studies provide evidence that intense learning experiences akin to those faced by younger populations are possible in older populations, and may facilitate gains in cognitive abilities."

https://www.sciencedaily.com/releases/2019/07/190717084237.htm


Risk and progression of Alzheimer's disease differs by sex

July 16, 2019

Science Daily/Vanderbilt University Medical Center

The abnormal accumulation of proteins in the brain is a biological marker for Alzheimer's disease, but the ways in which these proteins spread may help explain why the prevalence of Alzheimer's is higher in women than in men.

 

A recent study by researchers from the Center for Cognitive Medicine (CCI) at Vanderbilt University Medical Center identified differences in the spread of a protein called tau -- which is linked to cognitive impairment -- between men and women, with women showing a larger brain-wide accumulation of tau than men due to an accelerated brain-wide spread.

 

The findings were presented at the Alzheimer's Association International Conference July 14-18 in Los Angeles.

 

Accumulating evidence suggests that tau spreads through brain tissue like an infection, traveling from neuron to neuron and turning other proteins into abnormal tangles, subsequently killing brain cells. Using data from positron emission tomography (PET) scans of healthy individuals and patients with mild cognitive impairment who were enrolled in the Alzheimer's Disease Neuroimaging Initiative (ADNI) database, CCI researchers constructed in vivo networks modeling tau spread using graph theory analysis.

 

"It's kind of like reconstructing a crime scene after a crime. You weren't there when it happened, but you can determine where an intruder entered a house and what room they entered next," said Sepi Shokouhi, PhD, assistant professor of Psychiatry and Behavioral Sciences and lead investigator for the study. "The graph analysis does something similar to show how tau spreads from one region to another."

 

The results of the analysis showed the architecture of tau networks is different in men and women, with women having a larger number of "bridging regions" that connect various communities in the brain. This difference may allow tau to spread more easily between regions, boosting the speed at which it accumulates and putting women at greater risk for developing Alzheimer's disease.
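A rough way to flag "bridging regions" in such a network is to detect communities and count how many distinct communities each node's neighbours fall into. The sketch below does this with networkx on a random stand-in graph; it is an illustration of the concept, not the study's graph analysis or its data.

# Illustrative detection of "bridging" nodes in a tau network: detect
# communities, then flag nodes whose neighbours span several communities.
# Random stand-in graph, not the ADNI-derived networks from the study.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

tau_network = nx.watts_strogatz_graph(n=80, k=6, p=0.2, seed=2)

communities = greedy_modularity_communities(tau_network)
membership = {node: i for i, comm in enumerate(communities) for node in comm}

bridging_score = {
    node: len({membership[nbr] for nbr in tau_network.neighbors(node)})
    for node in tau_network
}
top_bridges = sorted(bridging_score, key=bridging_score.get, reverse=True)[:5]
print("candidate bridging regions:", top_bridges)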

 

If proven, an accelerated spread of tau in women may indicate a need for sex-specific approaches for the prevention of Alzheimer's disease, including earlier therapies, lifestyle interventions and/or cognitive remediation. More studies are needed to validate the accelerated tau spread model in women.

 

"Understanding how different biological processes influence our memory is a really important topic. Sex-specific differences in the brain's pathological, neuroanatomical and functional organization may map into differences at a neurobehavioral and cognitive level, thus explaining differences in the prevalence of neurodegenerative disorders and helping us develop appropriate treatments," said Shokouhi.

https://www.sciencedaily.com/releases/2019/07/190716124853.htm


Exercise offers protection against Alzheimer's

July 16, 2019

Science Daily/Massachusetts General Hospital

Higher levels of daily physical activity may protect against the cognitive decline and neurodegeneration (brain tissue loss) from Alzheimer's disease (AD) that alters the lives of many older people, researchers from Massachusetts General Hospital (MGH) have found. In a paper in JAMA Neurology, the team also reported that lowering vascular risk factors may offer additional protection against Alzheimer's and delay progression of the devastating disease. The findings from this study will be presented at the Alzheimer's Association International Conference (AAIC) in Los Angeles by the first author of the study, Jennifer Rabin, PhD, now at the University of Toronto, Sunnybrook Research Institute.

 

"One of the most striking findings from our study was that greater physical activity not only appeared to have positive effects on slowing cognitive decline, but also on slowing the rate of brain tissue loss over time in normal people who had high levels of amyloid plaque in the brain," says Jasmeer Chhatwal, MD, PhD of the MGH Department of Neurology, and corresponding author of the study. The report suggests that physical activity might reduce b-amyloid (Ab)-related cortical thinning and preserve gray matter structure in regions of the brain that have been implicated in episodic memory loss and Alzheimer's-related neurodegeneration.

 

The pathophysiological process of AD begins decades before clinical symptoms emerge and is characterized by early accumulation of β-amyloid protein. The MGH study is among the first to demonstrate the protective effects of physical activity and vascular risk management in the "preclinical stage" of Alzheimer's disease, while there is an opportunity to intervene prior to the onset of substantial neuronal loss and clinical impairment. "Because there are currently no disease-modifying therapies for Alzheimer's disease, there is a critical need to identify potential risk-altering factors that might delay progression of the disease," says Chhatwal.

 

The Harvard Aging Brain Study at MGH assessed physical activity in its participants -- 182 normal older adults, including those with elevated β-amyloid who were judged at high risk of cognitive decline -- through hip-mounted pedometers that counted the number of steps walked during the course of the day.

 

"Beneficial effects were seen at even modest levels of physical activity, but were most prominent at around 8,900 steps, which is only slightly less than the 10,000 many of us strive to achieve daily," notes co-author Reisa Sperling, MD, director of the Center for Alzheimer's Research and Treatment, Brigham and Women's Hospital and Massachusetts General Hospital and co-principal investigator of the Harvard Aging Brain Study.

 

Interventional approaches that target vascular risk factors along with physical exercise offer added benefit, she adds, since the two appear to operate independently. Vascular risk factors measured by the researchers were drawn from the Framingham Cardiovascular Disease Risk Score Calculator, and include age, sex, weight, smoking status, blood pressure, and whether people are on treatment for hypertension.

 

Through ongoing studies MGH is working to characterize other forms of physical activity and lifestyle changes that may help retard the progress of Alzheimer's disease. "Beta amyloid and tau protein build-up certainly set the stage for cognitive impairment in later age, but we shouldn't forget that there are steps we can take now to reduce the risk going forward -- even in people with build-up of these proteins," says Chhatwal. "Alzheimer's disease and the emergence of cognitive decline is multifactorial and demands a multifactorial approach if we hope to change its trajectory."

https://www.sciencedaily.com/releases/2019/07/190716193543.htm
