NC State News — News from NC State University

Nile Project Concert Coming in March (Dec. 8, 2016)
Purchase New Technology with Payroll Deduction (Dec. 8, 2016)
NC State Recognized as Bike-Friendly Campus (Carla Davis, Dec. 8, 2016)
2016 Chancellor’s Report (University Communications, Dec. 8, 2016)
Castellano Named Goodnight Distinguished Chair (Dec. 8, 2016)

Drug Use Strong Predictor for Postpartum Mental Health Problems
By Matt Shipman | Dec. 8, 2016

New research from North Carolina State University and the University of British Columbia finds that a woman’s lifetime history of drug use can help predict whether she will suffer from problems with stress and anxiety after childbirth. The finding could help health-care providers screen pregnant women for mental health problems and provide relevant treatment.

“There’s been a lot of attention recently on the need to incorporate mental health screening into prenatal care, and it has largely focused on identifying women who are at risk of postpartum depression,” says Sarah Desmarais, an associate professor of psychology at NC State and co-author of a journal article on the work.

“Our study has two important findings that are relevant to that discussion,” Desmarais says. “First, we found that women are at risk of significant postpartum mental health problems other than depression – stress and anxiety are serious issues that merit attention. Second, by incorporating questions about a woman’s history of drug use, we can help health-care providers more accurately identify women who are at risk of postpartum stress and anxiety – and take steps to provide the necessary care.”

The study was not designed to focus specifically on drug use, but was instead aimed at answering the broader question of whether women’s use of alcohol and drugs at any point in their lifetime predicted mental health challenges after childbirth.

“Historically, a lot of research focused on women’s substance use during pregnancy,” Desmarais says. “We thought that may not be a reliable way of capturing women’s substance use, because women are likely less willing to admit to substance use during pregnancy – they’re concerned about losing parental custody, dealing with social stigma, or biasing their treatment and care. What’s more, pregnancy is not when women begin using drugs or alcohol; that’s something that carries over from a woman’s behavior before pregnancy.”

To examine these issues, researchers used data from interviews with 100 women in British Columbia who had given birth in the previous three months, were largely from higher socioeconomic backgrounds and were not considered at high risk of postpartum mental health problems. The study participants were recruited to join a broad health and wellness study, which was not specifically focused on substance use.

In those interviews, women were asked about their history of alcohol use and their history of drug use.

“The key finding is that asking about lifetime drug use really helped us predict whether a woman would experience postpartum mental health problems,” Desmarais says.

“The best predictor of postpartum mental health problems is still whether a woman has a history of mental health problems,” Desmarais adds. “But when you include a history of drug use, the likelihood increases significantly.”

Specifically, prior drug use was associated with heightened symptoms of stress and anxiety after childbirth. Drug use was not associated with postpartum depression, and prior alcohol use was not associated with any postpartum mental health problems.

The paper, “Lifetime substance use as a predictor of postpartum mental health,” is published in the journal Archives of Women’s Mental Health. Lead author of the study is Betty-Shannon Prevatt, a Ph.D. student at NC State. The paper was co-authored by Patricia Janssen of UBC. The work was done with support from the British Columbia Mental Health and Addictions Research Network, the Social Sciences and Humanities Research Council of Canada, and the Michael Smith Foundation for Health Research.


Note to Editors: The study abstract follows.

“Lifetime substance use as a predictor of postpartum mental health”

Authors: Betty-Shannon Prevatt and Sarah L. Desmarais, North Carolina State University; Patricia A. Janssen, University of British Columbia

Published: Dec. 3, Archives of Women’s Mental Health

DOI: 10.1007/s00737-016-0694-5

Abstract: Postpartum mood disorders (PPMD) affect approximately 10–20% of women and have adverse consequences for both mom and baby. Lifetime substance use has received limited attention in relation to PPMD. The present study examined associations of lifetime alcohol and drug use with postpartum mental health problems. Women (n = 100) within approximately 3 months postpartum (M = 2.01, SD = 1.32) participated in semi-structured interviews querying lifetime substance use, mental health history, and postpartum symptoms of anxiety, stress, posttraumatic stress disorder (PTSD), depression, and obsessive compulsive disorder. The study was conducted in an urban Canadian city from 2009 to 2010. Analyses revealed that lifetime substance use increased the variability explained in postpartum PTSD (p = .011), above and beyond sociodemographic characteristics and mental health history. The same trend, though not significant, was observed for stress (p = .059) and anxiety (p = .070). Lifetime drug use, specifically, was associated with postpartum stress (p = .021) and anxiety (p = .041), whereas lifetime alcohol use was not (ps ≥ .128). Findings suggest that lifetime drug use is associated with PPMD. Future research should examine whether screening for lifetime drug use during antenatal and postpartum care improves identification of women experiencing PPMD.

Renowned Chef Howard Speaks on Campus (Dee Shore, Dec. 7, 2016)

When Neurons are ‘Born’ Impacts Olfactory Behavior in Mice
By Tracey Peake | Dec. 6, 2016

New research from North Carolina State University shows that neurons generated at different life stages in mice can impact aspects of their olfactory sense and behavior. The work could have implications for our understanding of neurodevelopmental processes or traumatic brain injuries in humans.

Troy Ghashghaei, associate professor of neurobiology at NC State, studies the ways neurons develop and integrate into the “circuitry” of the brain. Mice are an excellent model for study because even in adulthood they continue to produce neurons in two brain regions, one of which supplies the smell, or olfactory, centers of the brain.

Working with a population of young adult mice, Ghashghaei and his team looked at olfactory neurons that were generated when the mice were either newborn or young adults. The team wanted to know if there was a difference between the function of neurons that developed at different life stages.

“One way to study the function of different populations of neurons is to shut them off during different behavioral paradigms,” says Ghashghaei. To shut off neurons, the researchers introduced a gene into olfactory stem cells in the mice. The gene encoded a protein that would respond to a particular drug by turning off those olfactory neurons after they had matured. Thus the researchers could shut off neurons generated at different developmental time points.

In young adult mice, stopping activity of adult-born neurons affected their ability to recognize and develop memories for novel food odors – odors that they had never been exposed to before. In contrast, if the odor was aversive, or indicated danger – like the scent of a fox, for example – shutting off the adult-generated neurons had no effect; the mice responded normally by freezing in place. The adult-generated neurons, therefore, did not appear to have a role in mediating the innate response the mice had to aversive odors.

Puzzled by this finding, Ghashghaei and the team wondered if neurons generated immediately after mice are born were connected with responses to aversive odors. So they shut off ‘early-born’ neurons in the mice and found that the usual response to aversive odors was interrupted: the mice seemed unaffected by the presence of a fox odor.

“Developmentally, there is a progression of neuronal addition to the olfactory system in mice,” Ghashghaei says. “What this study demonstrates is that there are developmentally defined circuits – generated at specific points in time – that regulate different values of new sensory stimuli, and how sensory responses are processed and learned.

“The next questions to explore are how specific sets of neurons, generated at specific points in time, work together in complex behaviors, and how they may or may not be working in neurodevelopmental diseases or in conditions such as autism. Additionally, we want to look at how the neurons we have discovered are wired to other brain regions and whether or not these networks are responsible for regulating hedonic aspects of sensory perception.”

The work appears in the journal Nature Neuroscience. Ghashghaei is funded by the National Institutes of Health (grants R01NS098370 and R01NS089795). Nagendran Muthusamy, a former NC State postdoctoral researcher now at UNC-Chapel Hill; NC State graduate student Caroline Johnson; and research associate Xuying Zhang contributed to the work.


Note to editors: An abstract of the paper follows.

“Developmentally defined forebrain circuits regulate appetitive and aversive olfactory learning”

DOI: 10.1038/nn.4452

Authors: Troy Ghashghaei, Xuying Zhang, Caroline Johnson, NC State University; Nagendran Muthusamy, UNC-Chapel Hill; Prem Yadav, CSIR-Central Drug Research Institute, Lucknow, India
Published: Nature Neuroscience

Abstract: Postnatal and adult neurogenesis are region- and modality specific, but the significance of developmentally distinct neuronal populations remains unclear. We demonstrate that chemogenetic inactivation of a subset of forebrain and olfactory neurons generated at birth disrupts responses to an aversive odor. In contrast, novel appetitive odor learning is sensitive to inactivation of adult-born neurons, revealing that developmentally defined sets of neurons may differentially participate in hedonic aspects of sensory learning.

McKeand Nabs Top Service Award for Work With Pines
By David Hunt | Dec. 5, 2016

Steve McKeand gets funny looks when he tells people what he does for a living. “They have one of two reactions: stunned silence, or a giggle,” he says.

It’s not easy being a tree breeder, it seems. But it’s no laughing matter, either.

McKeand, a professor of forestry and environmental resources in the College of Natural Resources at NC State University and director of the university’s Cooperative Tree Improvement Program, is among a select group of scientists responsible for enhancing one of the nation’s most important natural resources — the estimated 39 million acres of planted pine forests stretching across the United States from central Texas to southern New Jersey.

His work is vital, not just for giving loblolly and longleaf pine trees a genetic advantage in the face of threats such as disease, pests and a changing climate, but also for giving the owners of forest land a competitive advantage in the market.

In recognition of his successful efforts on both counts, McKeand received the Governor James E. Holshouser Jr. Award for Excellence in Public Service from the Board of Governors of the University of North Carolina. The annual award was established in 2007 to encourage, identify, recognize and reward public service by faculty of the university system.

In the Treetops

Tree breeding is a time-intensive endeavor, says McKeand, who spends much of his time out in the woods and at the tops of trees in the breeding orchards operated by the program’s members throughout the Southeast. He calls these orchards “living warehouses” of the best loblolly pine trees in the region.

The process itself is straightforward, he says, ticking off the steps involved in tree breeding in rapid order.

“We go out and look for good trees. We select them and put a tag on them. Then we take cuttings off of them and graft the shoots into the tops of large trees so they will produce cones in two to three years,” he explains. “Then we do the breeding, collect the cones, extract the seeds and test the offspring for five or six years in field trials around the South.”

McKeand and his colleagues use this cross-breeding technique to develop trees with specific traits prized by growers. Primarily, they’re looking for fast-growing trees that are resistant to disease and have desirable wood properties and straight stems.

When the Tree Improvement Program was established by professor Bruce Zobel in 1956, it took a while to catch on with North Carolina growers. “There were a lot of skeptics at first,” McKeand says. “But it’s been a tremendous success. We’ve had huge impacts on productivity and value.”

One measure of success is the popularity of PRS, a rating system that helps landowners select seedlings that have been bred for specific traits, such as wood production, rust resistance or stem quality.

“In the old days, you took whatever the nurseries had available. The genetics were good but maybe not optimal for the specific objectives of each landowner,” he says. “Landowners now have a huge range of options available to them.”

In fact, genetic traits coaxed from pine trees in the program’s orchards are now found in about 60 percent of the trees planted in commercial forests in the United States.

Public-Private Model

The program’s success is based on its innovative operating model: It’s a membership organization comprising forestry companies, landowners, nurseries and public agencies. From the original 12 members in 1956, the program has grown to include more than 30 full and contributing members and four research associate members.

“Forestry is big business in North Carolina,” McKeand explains. “There are over half a million forest landowners in this state and almost five million in the South.”

For many families, forest land is an important source of income, generating funds to cover the cost of college tuition for their children or to ensure a more secure retirement. Selecting a tree crop to plant is a much greater long-term investment than choosing most agricultural crops.

“If you’re planting your family land, you want to make sure you’re planting trees with the right genetics, because you’re going to be stuck with them for 25 or 30 years. You don’t want to end up saying, ‘oops,’” he says. “Unfortunately, there are an awful lot of those ‘oops’ acres out there.”

For all its value, the program’s growth wasn’t assured. A reduction in the number of forest product companies in the last 10 to 15 years threatened to undercut the program’s funding base.

“They started merging and buying each other out, and then, for a variety of reasons, the big, vertically integrated forestry product companies sold off their land to investors or restructured their companies,” McKeand says.

Faced with the risk of abandoning decades of work in tree breeding, McKeand struggled to find a solution. “We had to become creative,” he says. “We had to find a way to sustain this.”

Taking a page from industry, McKeand and his colleagues hit on an idea to attract new partners. They restructured the program to add a membership level for the new players in the industry, such as timber investment management organizations, or TIMOs, real estate investment firms, consulting firms, nurseries, sawmill owners and large landowners.

“The amount of forest acreage hadn’t declined,” he explains. “But a lot of land had been purchased by TIMOs and other investors. These organizations were very interested in tree improvement, but they didn’t want to do tree improvement themselves.”

As cooperative members of the Tree Improvement Program, the contributing members have access to information derived from the program’s genetic research. “That allows them to make wise decisions about reforestation,” McKeand says.

Students Benefit

Revenue from new members has not only helped save the breeding program; it has also allowed McKeand and his colleagues to expand opportunities for top undergraduate and graduate students.

“We are able to provide them with income opportunities,” he says. “But they also get a heck of a lot of experience doing good forestry, good lab work, field work and greenhouse work.”

McKeand, who launched his own career in forestry four decades ago at Purdue University, sees a bright future for today’s students.

“Back in the day, if you were in tree improvement you basically worked in forestry,” he explains. “Today there are a variety of opportunities. We teach the fundamentals of genetics and genomics, so we’ve had graduates go on to work in the agronomic industry, in functional genomics, in crop-breeding programs as well as forestry.”

To open that path to as many young minds as possible, McKeand and his family recently established the McKeand Family Scholarship endowment in NC State’s College of Natural Resources. The scholarship is open to incoming freshmen in forestry and environmental resources, with a preference given to students from Millbrook High School in Raleigh, where McKeand’s wife, Lou, is a longtime chemistry teacher.

McKeand is donating the cash award he received with the Holshouser Award to further increase the scholarship endowment.

“Looking back, I can’t think of a career that could possibly have been more gratifying,” he says. “The university and my college and department have provided me with the opportunity to be creative, to be innovative, to be practical, to do work with students, to do research, and most important, to make a difference.”

Woodsons Host Holiday Reception on Dec. 7 at The Point
Dec. 1, 2016

Chancellor Randy Woodson and his wife Susan invite all NC State faculty and staff to attend their annual holiday open house on Wednesday, Dec. 7, from 11 a.m. to 1 p.m. at The Point.

Parking is available at the residence, as well as in the nearby Park Alumni Center lot. For added convenience, a shuttle will run continuously between The Point and the surface parking lot (facing Cates Avenue) at the Coliseum Parking Deck on main campus.

Festive food will be served.


Outfitting the Future
By Christy Sadler | Dec. 1, 2016

Wearable technology is about more than smartwatches or counting steps. Across NC State, researchers are using it to solve problems — monitoring heart rate and environmental dangers, powering electronic devices, delivering medications, building better prosthetics and improving safety.

They’re developing technologies that are functional, efficient, innovative and practical, and that could have an impact on countless lives.

Here are a few of the NC State projects at the forefront of this evolving field.

What’s NEXT in Wearables

What if the clothes you already wear not only covered your body but also kept track of how it’s functioning — and all you had to do was put them on?

Jesse Jur of the NEXT group holds up a T-shirt containing iron-on sensors.

Finding innovative, useful and economical ways to integrate electronics into clothing is the mission of the College of Textiles’ Nano-Extended Textiles (NEXT) Research Group.

Headed by Jesse Jur, assistant professor in the Department of Textile Engineering, Chemistry and Science, the NEXT group seeks to create cost-effective, energy-efficient wearable technology that’s powered by the user’s own body.

Jur’s team has gained attention for projects like customizable, iron-on sensors that monitor the heart’s performance and transmit the readings to a smartphone, or that monitor environmental levels of potentially dangerous gases like carbon monoxide and ozone.

The NEXT group has also explored bioluminescence in fashion through a collaboration with recent College of Textiles graduate Jazsalyn McNeil, who joined the group as a “fusion designer” to meld her design sensibility with the group’s research. McNeil’s Pulse Dress incorporates screen-printed sensors that make LED lights blink with the wearer’s heartbeat. NEXT and McNeil hope that the eye-catching dress will both influence fashion and draw attention to the possibilities of wearable electronics.


Heating Up Wearable Tech

In recent years, smartwatches have turned up on the arms of millions of people who want convenient ways to keep track of their fitness, but these still depend on conventional batteries. At NC State’s Center for Advanced Self-Powered Systems of Integrated Sensors and Technologies (ASSIST) — a National Science Foundation Nanosystems Engineering Research Center — researchers are developing innovative health-monitoring devices that are battery-free and body-powered.

“The goal of ASSIST is to make wearable technologies that can be used for long-term health monitoring, such as devices that track heart health or monitor physical and environmental variables to predict and prevent asthma attacks,” said Daryoosh Vashaee, an associate professor of electrical and computer engineering in the NC State College of Engineering.

A T-shirt (left) and armband (right) embedded with thermoelectric generators.

Vashaee and a team of undergraduates and faculty members have developed a new approach for harvesting body heat and converting it into electricity to power wearable electronics. The prototype armbands and embedded sensors in T-shirts are lightweight, conform to the shape of the body and can generate far more electricity than previous lightweight heat-harvesting technologies.

“We want to make devices that don’t rely on batteries,” Vashaee said. “And we think this design and prototype moves us much closer to making that a reality.”

Taking the Sting Out of Diabetes

For some people with serious health issues, wearable technology has the potential to offer more than bells and whistles — it could make their treatments easier and even save lives.

Zhen Gu, an associate professor in the UNC/NC State Joint Department of Biomedical Engineering, has developed a glucose-responsive insulin patch for people living with Type 1 diabetes. About the size of a penny, the thin, square patch contains more than a hundred tiny, painless needles that supply the wearer with insulin as needed. This potential treatment could help to ensure consistent blood-sugar levels — and spare patients regular injections.

Zhen Gu and his team work in his biomedical engineering lab on Centennial Campus.

Gu, who has been honored as one of MIT Technology Review’s “Innovators Under 35” for his work with innovative drug-delivery systems, received $4.6 million in funding from JDRF (formerly the Juvenile Diabetes Research Foundation) and multinational pharmaceutical company Sanofi for the project. The patch is currently in animal trials. Gu is also working on patches to deliver melanoma drugs directly to tumor sites and to deliver blood thinners as needed to prevent blood clots.

Walking Wearables

Amputees have always been among the earliest adopters of wearable technology, as even minor advances in prosthetics can markedly improve their mobility. Helen Huang, associate professor of biomedical engineering and director of the Rehabilitation Engineering Core in the UNC/NC State Joint Department of Biomedical Engineering, has made it her mission to develop the next generation of powered prosthetic limbs.

Huang’s projects include software that allows powered prosthetics to tune themselves automatically, making the devices more responsive and lowering the costs associated with powered prosthetic use.

An automatically tuned prosthetic in action in Helen Huang’s lab.

“People are dynamic — a patient’s physical condition may change as he or she becomes accustomed to a prosthetic leg, for example, or they may gain weight,” said Huang. “These changes mean the prosthetic needs to be re-tuned, and working with a prosthetist takes time and money.”

Huang’s team has also worked on technology that translates electrical signals in human muscles into signals that control powered prosthetic limbs — enabling sensors in the prosthetics to follow simple cues from the user’s brain such as “open hand” or “close hand.”

A Bright Idea for Safety

For College of Textiles alumnus Jeremy Wall, a near miss with a car while he was riding his bike one night became an unexpected source of inspiration: He now heads a company, Lumenus, that’s developing clothing and accessories with embedded smart LED lighting.

Wall, a 2014 graduate in fashion and textile management, began working on his tech with the help of an undergraduate research scholarship while he was still a student. His goal was to help cyclists, motorcyclists and runners be more visible to motorists at night while staying stylish and functional during the day.

Alumnus Jeremy Wall, founder of Lumenus, outside the NC State College of Textiles.

The company will soon hit the market with apparel and accessories including jackets, vests, leggings, backpacks and armbands. It’s also licensing its technology to companies such as backpack manufacturer Timbuk2 and working with the Department of Defense to develop sensors for military gear.

Lumenus has also created an app that adds extra features to the apparel. For example, the wearer can enter a destination on the app, and the LED lights on the garment will flash strategically at intersections or other potentially hazardous points along the route.

Wall recently returned to NC State for help getting his company off the ground, enlisting three College of Textiles undergraduates to work with Lumenus as part of their senior design project.

Don’t Trust Your Waiter for Food Safety Advice
By Matt Shipman | Dec. 1, 2016

I went out for lunch recently at an upscale restaurant. Other guests wore suits, there was an extensive wine list, and the server was extremely upbeat. What she didn’t know, and I did, was that my guest for lunch was a food safety expert – and her tableside manner was being judged.

Shortly after being seated, my dining companion pointed to the bottom of the menu.

“Consuming undercooked meats may increase risk of foodborne illness,” said Ben Chapman, a food safety researcher at NC State. “It’s right there on the menu. Now let’s see if the server follows through.”

When the server returned, Chapman ordered a medium-rare hamburger. The server didn’t mention anything, so Chapman asked how the restaurant knew whether the burger would be safe to eat.

The server said that the cooks could tell whether the hamburger was safe by feeling how firm the burger was, and noted that lots of people order medium-rare hamburgers and don’t get sick. Chapman changed his order to well-done anyway, and the server left to get our drinks.

“This,” Chapman said, “is basically everything that can go wrong with how restaurant servers share food safety information with consumers: the menu gives patrons vague, but accurate, information. And the server gave us information that’s inaccurate and not based on the science.”

And Chapman knows what he’s talking about – he just published a paper evaluating how restaurants handle food safety communication, based on the experiences of “secret shoppers” at 265 restaurants scattered across the United States.

So what does make a hamburger safer?

  • Cook hamburgers to 155°F held for 15 seconds, or to 160°F (an instant kill).
  • In many jurisdictions, restaurants are required to cook to these temperatures unless a customer requests otherwise.
  • Restaurants should have thermometers in the kitchen; if they don’t, you may want to reconsider your dining choice.
  • Don’t trust color (no red or pink) as an indicator of safety.
  • Juices “running clear” don’t mean the burger has reached a safe temperature.
  • The touch, feel or look of the meat is not a reliable way of determining how well cooked the hamburger is.
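The time-and-temperature rule above reduces to a simple check. As a minimal illustrative sketch (not food-safety software — the function name is hypothetical, and the thresholds are the 155°F/15-second and 160°F values cited in the article):

```python
def burger_is_safe(temp_f: float, hold_seconds: float = 0) -> bool:
    """Time/temperature rule cited in the article: 160°F is an
    instant kill; 155°F is safe only if held for 15 seconds."""
    if temp_f >= 160:
        return True
    if temp_f >= 155 and hold_seconds >= 15:
        return True
    return False

# Color is not a reliable indicator: a brown burger can be unsafe
# and a pink one safe. Only a thermometer reading matters.
print(burger_is_safe(160))      # True: instant-kill temperature
print(burger_is_safe(155, 15))  # True: held at 155°F for 15 seconds
print(burger_is_safe(158, 5))   # False: above 155°F but not held long enough
```

Note that the check depends only on measured temperature and hold time — exactly why the researchers fault servers who cite color or firmness.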

The study, which was performed by a team of researchers from NC State and RTI International, is published in the Journal of Food Protection.

Study: Restaurants Not Good At Explaining Risks of Undercooked Meat to Customers
By Matt Shipman | Dec. 1, 2016

Front-line staff, such as servers in restaurants, are often trusted with providing customers with food safety information about their meals. A challenge for the food-service industry is that these positions have high turnover and relatively low wages, and servers are focused primarily on giving patrons a positive experience. New research shows that this poses a problem.

A recent study finds restaurants don’t do an effective job of communicating with customers when it comes to addressing risks associated with eating undercooked meat – specifically hamburgers. Inaccurate information provided by servers often contradicts science-based information customers need to make informed food safety decisions.

All 50 states in the U.S. have adopted some version of the Food & Drug Administration’s Model Food Code, which requires restaurants to tell customers about risks associated with undercooked meat and poultry products.

“We wanted to know how well restaurant servers and menus communicated with customers about these risks, specifically in the context of beef hamburgers,” says Ben Chapman, co-author of a study on the work and an associate professor at North Carolina State University whose research program is aimed at improving food safety.

The researchers focused on beef hamburgers because consuming undercooked ground beef has been linked to a lot of foodborne illness outbreaks, including outbreaks related primarily to Shiga toxin-producing E. coli.

For this study, the researchers sent trained “secret shoppers” into 265 full-service, sit-down restaurants in seven different regions around the U.S. At each restaurant, the patrons ordered one well-done hamburger and one medium-rare hamburger to go. The shoppers then recorded how, if at all, the restaurant communicated about risk.

This study is the latest in a long line of real-world research that Chapman and his collaborators have conducted.

“We try to match what people actually do against what they say they do, because people will say anything on a survey,” Chapman says. “We’ve looked at cooking shows; observed handwashing and cross-contamination in commercial kitchens; and examined hand hygiene during a norovirus outbreak, among other studies. What people actually do is the difference between an enjoyable meal and a foodborne illness.”

“For example,” Chapman says, “did the server mention risks associated with undercooked meat when the shopper ordered? If not, the shopper would ask about the risk of getting sick, and then record whether the wait staff responded with clear, accurate information.”

The shoppers also looked to see whether restaurants included clear, accurate risk information on their menus.

The study found that 25 percent of restaurants wouldn’t even sell an undercooked hamburger to secret shoppers. However, at restaurants that would sell a medium-rare hamburger, the majority of servers – 77 percent – gave customers unreliable information about food safety.

“Servers said that meat was safe because it was cooked ‘until the juices ran clear’ – which is totally unreliable,” says Ellen Thomas, a food safety scientist at RTI International and lead author of the study, who worked on the project while a Ph.D. student at NC State. “Those 77 percent didn’t mention things like cooking meat to the appropriate temperature – either 155°F for 15 seconds, or 160°F for an instant kill.

“The indicator of safety most widely reported by servers was the color of the burger, and that’s also not reliable,” Thomas says. “Time and temperature are all that matter. An undercooked, unsafe burger can be brown in the middle, and a safely cooked burger can still be red or pink in the center.”
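As an illustration, the time-and-temperature rule the researchers describe can be sketched in code. This is a hypothetical helper for exposition only, not a food-safety tool; it encodes the two thresholds quoted above (155°F held for 15 seconds, or 160°F instantaneously) and deliberately takes no color argument, since color is exactly the cue the study found to be unreliable.

```python
def burger_is_safely_cooked(internal_temp_f: float, seconds_at_temp: float) -> bool:
    """True if the time/temperature combination meets the rule quoted in the study.

    Note: color is intentionally not a parameter -- per the researchers,
    only time and temperature matter.
    """
    if internal_temp_f >= 160:  # "instant kill" threshold
        return True
    if internal_temp_f >= 155 and seconds_at_temp >= 15:
        return True
    return False


print(burger_is_safely_cooked(160, 0))   # True
print(burger_is_safely_cooked(155, 15))  # True
print(burger_is_safely_cooked(155, 5))   # False
print(burger_is_safely_cooked(145, 60))  # False -- "juices ran clear" proves nothing
```

A burger that fails this check can still look done in the middle, which is the study's central point about color.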

Meanwhile, almost all of the menus complied with FDA guidance. But what servers told customers often contradicted the information on the menu.

“If a menu says something is risky but a server says that it isn’t, that can downplay the risks for consumers and affect a customer’s decisions,” Chapman says. “It’s confusing, leaving the patron to choose which message to believe.”

The researchers also found that chain restaurants fared much better than independent restaurants at having servers offer reliable risk information.

“That’s not surprising,” Chapman says. “Large chains implement standardized training across all outlets for servers in order to protect their brand and reduce the likelihood of being implicated in a foodborne illness outbreak. That’s bad for business.

“This study tells us that servers aren’t good risk communicators,” Chapman says. “We encourage consumers to ask food-safety questions, but they should probably ask a manager.

“It also tells us that we need to work on addressing the widespread – and wrong – belief that color is a reliable indicator of food safety in meat,” Chapman adds. “Restaurants are in a position to help us share this information with consumers, but many servers are currently sharing incorrect information.”

The paper, “Assessment of Risk Communication about Undercooked Hamburgers by Restaurant Servers,” is published in the Journal of Food Protection. The paper was co-authored by Andrew Binder, Anne McLaughlin, Lee-Ann Jaykus and Dana Hanson of NC State, and by Doug Powell of Powell Food Safety. The research was supported by Agriculture and Food Research Initiative Competitive Grant no. 2012-68003-30155 from the USDA National Institute of Food and Agriculture.


Note to Editors: The study abstract follows.

“Assessment of Risk Communication about Undercooked Hamburgers by Restaurant Servers”

Authors: Ellen M. Thomas, RTI International; Andrew Binder, Anne McLaughlin, Lee-Ann Jaykus, Dana Hanson, and Benjamin Chapman, North Carolina State University; and Doug Powell, Powell Food Safety

Published: Dec. 1, Journal of Food Protection

DOI: 10.4315/0362-028X.JFP-16-065

Abstract: According to the U.S. Food and Drug Administration 2013 Model Food Code, it is the duty of a food establishment to disclose and remind consumers of risk when ordering undercooked food such as ground beef. The purpose of this study was to explore actual risk communication activities of food establishment servers. Secret shoppers visited restaurants (n=265) in seven geographic locations across the U.S., ordered medium rare burgers, and collected and coded risk information from chain and independent restaurant menus and from server responses. The majority of servers reported an unreliable method of doneness (77%) or other incorrect information (66%) related to burger doneness and safety. These results indicate major gaps in server knowledge and risk communication, and the current risk communication language in the Model Food Code does not sufficiently fill these gaps. Furthermore, should servers even be acting as risk communicators? There are numerous challenges associated with this practice including high turnover rates, limited education, and the high stress environment based on pleasing a customer. If it is determined that servers should be risk communicators, food establishment staff should be adequately equipped with consumer advisory messages that are accurate, audience-appropriate, and delivered in a professional manner so as to help their customers make more informed food safety decisions.

Gift Creates New Opportunities for Veterans (published Dec. 1, 2016)

New Findings Boost Promise of Molybdenum Sulfide for Hydrogen Catalysis
By Matt Shipman | Dec. 1, 2016

Researchers from North Carolina State University, Duke University and Brookhaven National Laboratory have found that molybdenum sulfide (MoS2) holds more promise than previously thought as a catalyst for producing hydrogen to use as a clean energy source. Specifically, the researchers found that the entire surface of MoS2 can be used as a catalyst, not just the edges of the material.

“The key finding here is that the intrinsic catalytic performance of MoS2 is much better than the research community thought,” says Linyou Cao, an associate professor of materials science and engineering at NC State and senior author of a paper describing the work. “We’re optimistic that this can be a step toward making hydrogen a larger part of our energy portfolio.”

Hydrogen promises clean energy, producing only water as a byproduct. But to create hydrogen for use as a clean energy source, ideally you’d be able to isolate the hydrogen gas from water – with the only byproduct being oxygen.

However, the key to creating hydrogen from water – a process called hydrogen evolution – is an efficient catalyst. Currently, the best catalyst is platinum, which is too expensive for widespread use.

Another candidate for a hydrogen evolution catalyst is MoS2, which is both inexpensive and abundant. But it has long been thought that MoS2 is of limited utility, based on the conventional wisdom that only the edges of MoS2 act as catalysts – leaving the bulk of the material inactive.

But the new findings from NC State, Duke and Brookhaven show that the surface of MoS2 can be engineered to maximize the catalytic efficiency of the material. And the key to this efficiency is the number of sulfur vacancies in the MoS2.

If you think of the crystalline structure of MoS2 as a grid of regularly spaced molybdenum and sulfur atoms, a sulfur vacancy is what happens when one of those sulfur atoms is missing.

“We found that these sulfur vacancies attract the hydrogen atoms in water at just the right strength: the attraction is strong enough to pull the hydrogen out of the water molecule, but weak enough to then let the hydrogen go,” says Cao.

The researchers also found that the grain boundaries of MoS2 – which the research community had speculated were catalytically active for hydrogen evolution – may provide only trivial activity. Grain boundaries are the boundaries between crystalline domains.

The findings point to a new direction for improving the catalytic performance of MoS2. Currently, the most common approach is to increase the number of edge sites, based on the conventional wisdom that only the edge sites are catalytically active.

“Our result indicates that grain boundaries should not be the factor to consider when thinking about improving catalytic activity,” Cao says. “The best way to improve the catalytic activities is to engineer sulfur vacancies. The edges of MoS2 are still twice as efficient at removing hydrogen atoms compared to the sulfur vacancies. But it’s difficult to create a high density of edges in MoS2 – a lot of the material’s area is wasted – whereas a large number of sulfur vacancies can be engineered uniformly across the material.”

The researchers have also found that there is a “sweet spot” for maximizing the catalytic efficiency of MoS2.

“We get the best results when between 7 and 10 percent of the sulfur sites in MoS2 are vacant,” Cao says. “If you go higher or lower than that range, catalytic efficiency drops off significantly.”
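That optimal window can be expressed as a one-line check. The helper below is hypothetical and purely illustrative; it simply encodes the 7-to-10-percent range of vacant sulfur sites that Cao's team reports, outside of which catalytic efficiency drops off significantly.

```python
def in_catalytic_sweet_spot(vacancy_percent: float) -> bool:
    """True if the fraction of vacant sulfur sites falls in the reported
    optimal range of 7-10 percent."""
    return 7.0 <= vacancy_percent <= 10.0


# Below, inside, and above the reported window:
for density in (5.0, 8.5, 12.0):
    print(density, in_catalytic_sweet_spot(density))
```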

Additionally, the researchers found that the crystalline quality of MoS2 is important for optimizing the catalytic activity of the sulfur vacancies. Sulfur vacancies in high-crystalline-quality MoS2 showed better efficiency than those in low-crystalline-quality MoS2, even when the densities of the vacancies were the same.

“In order to get the best output from sulfur vacancies, the crystalline quality of MoS2 needs to be very high,” says Guoqing Li, a Ph.D. student at NC State and lead author of the paper. “The ideal scenario would be 7 to 10 percent sulfur vacancies uniformly distributed in a single crystalline MoS2 film.”

The work was done using MoS2 thin films that are only three atoms thick. Using these engineered thin films, the researchers were able to achieve catalytic efficiency comparable to previous MoS2 technologies that relied on having two or three orders of magnitude more surface area.

“We now know that MoS2 is a more promising catalyst than we anticipated, and are fine-tuning additional techniques to further improve its efficiency,” Cao says. “Hopefully, this moves us closer to making a low-cost catalyst that is at least as good as platinum.”

The paper, “All the Catalytic Active Sites of MoS2 for Hydrogen Evolution,” is published in the Journal of the American Chemical Society. The paper was co-authored by Yifei Yu, David Peterson, Abdullah Zafar, Raj Kumar, Frank Hunte and Steve Shannon of NC State; Du Zhang, Stefano Curtarolo and Weitao Yang of Duke; and Qiao Qiao and Yimei Zhu of Brookhaven National Lab.

The work was done with support from the Department of Energy’s Office of Science, under grants DE-SC0012575 and DE-SC0012704, as well as by the National Science Foundation under grant PHY1338917.


Note to Editors: The study abstract follows.

“All the Catalytic Active Sites of MoS2 for Hydrogen Evolution”

Authors: Guoqing Li, Yifei Yu, David Peterson, Abdullah Zafar, Raj Kumar, Frank Hunte, Steve Shannon and Linyou Cao, North Carolina State University; Du Zhang, Stefano Curtarolo and Weitao Yang, Duke University; Qiao Qiao and Yimei Zhu, Brookhaven National Laboratory

Published: online Nov. 29, Journal of the American Chemical Society

DOI: 10.1021/jacs.6b05940

Abstract: MoS2 presents a promising low-cost catalyst for the hydrogen evolution reaction (HER), but the understanding about its active sites has remained to be limited. Here we present an unambiguous study for the catalytic activities of all possible reaction sites of MoS2, including edge sites, sulfur vacancies, and grain boundaries. We demonstrate that, in addition to the well-known catalytically active edge sites, sulfur vacancies provide another major active site for the HER while the catalytic activity of grain boundaries is much weaker. The intrinsic turnover frequencies (Tafel slopes) of the edge sites, sulfur vacancies, and grain boundaries are estimated to be 7.5 s-1 (65-75 mV/dec), 3.2 s-1 (65-85 mV/dec), and 0.1 s-1 (120-160 mV/dec), respectively. We also demonstrate that the catalytic activity of sulfur vacancies strongly depends on the density of the vacancies and the local crystalline structure at the proximity of the vacancies. Unlike edge sites, whose catalytic activity linearly depends on the length, sulfur vacancies show optimal catalytic activities when the vacancy density is in the range of 7-10%. And the catalytic activity of the sulfur vacancies in high crystalline quality MoS2 is higher than that in low crystalline quality MoS2, which may be related with different local crystalline structures at the proximity of the vacancies.

Bookstores Host Madness Sale
Published Nov. 30, 2016

Friday is the biggest shopping day on the NC State Bookstores calendar: the Finals Madness Sale.

The popular end-of-the-semester event features up to 40 percent off most merchandise at two on-campus locations. The sale will run from 10 a.m. until 8 p.m. at Wolfpack Outfitters in Talley Student Union and from 8:30 a.m. until 5:30 p.m. at the Pack Shop at Wolf Ridge Apartments on Centennial Campus.

The day-long sale is open to the public.


Military and Veteran Resource Center Opens
By University Communications | Nov. 30, 2016

NC State’s active-duty, veteran and ROTC populations have a brand-new resource on campus to help meet their specific – and sometimes unique – needs.

The Military and Veteran Resource Center – opened on the first floor of Witherspoon Student Center on Nov. 28 – aims to provide a one-stop shop where students connected to the military can get information about campus resources such as academic support, housing and health care, or simply make connections with others from similar backgrounds.

Adjusting to the responsibilities of being a college student is rarely easy, but service members face unique challenges. As many opt to pursue a college education after completing their military service, or generally later in life, these individuals are often older than their campus peers. Their life experiences differ significantly from those of students who are not affiliated with the military. The transition from a structured military setting into a university environment may present frustrations or challenges. Some may struggle to balance coursework and scheduling with ongoing military responsibilities. Additionally, they may live off campus or have family obligations at home, making it difficult to get to know fellow students or to feel connected to the campus community. New students may also find the process of seeking out necessary resources, many of which are spread across campus, to be daunting.

The Military and Veteran Resource Center will reduce those hurdles, allowing students to efficiently gain access to the many services and resources that are available at NC State, says Nicholas Drake, the new director of military and veteran services who is entering his 11th year of military service.

“I’m most excited about establishing a partnership with the greater community, because there are so many organizations that want to work with and hire this population after they graduate,” Drake said.

Drake is also committed to creating a sense of community and pride for service members. The resource center will provide opportunities for servicemen and women to gather socially and participate in experiences such as service learning and the Student Veteran Summit, part of Wolfpack Welcome Week.

Providing resources that are specific to the experiences of service members has become a best practice in higher education, Drake says. As greater numbers of military personnel seek to advance their education, institutions across the country are finding innovative ways to support this population.

NC State has a rich military history, steeped in tradition. Recently, the university was rated as a military-friendly university by Military Friendly, an independent survey-driven publishing organization.

“In recent years, NC State has taken great strides toward providing a campus community that is inclusive and welcoming for our military population,” Drake said. “This center is another crucial step toward ensuring that we are equipped to meet the needs of our servicemen and women.”

University Faculty Scholars Named
By Emily Packard | Nov. 30, 2016

NC State Chancellor Randy Woodson announced the 2016-17 class of University Faculty Scholars today. The 22 recipients represent top early- and mid-career faculty who are pursuing research to solve society’s most pressing problems.

University Faculty Scholars carry their title for a five-year period and receive a $10,000 annual award for supplemental salary and benefits, or for programmatic support. The Provost’s Office oversees the program, which was established by Chancellor Woodson in 2012.

Faculty members are nominated by their colleges and selected by a committee of senior faculty. Nominations are limited to assistant professors who have been reappointed for a second term; all associate professors; and full professors within the first three years of appointment at that rank. Nominees are evaluated on their research and scholarship productivity, excellence in teaching and mentoring, and leadership in extension, professional societies and public service initiatives.

This year’s class of University Faculty Scholars includes:

  • Margaret Blanchard, associate professor of science, technology, engineering and math (STEM) education
  • Lorena Bociu, assistant professor of mathematics
  • Kofi Boone, associate professor of landscape architecture
  • Sarah Bowen, associate professor of sociology and anthropology
  • Lisa Chapman, associate professor of textile and apparel, technology and management
  • Chih-Hao Chang, assistant professor of mechanical and aerospace engineering
  • Ke Cheng, associate professor of molecular biomedical sciences
  • Owen Duckworth, associate professor of crop and soil sciences
  • Ryan Emanuel, associate professor of forestry and environmental resources
  • Troy Ghashghaei, associate professor of molecular biomedical sciences
  • Zhen Gu, associate professor of biomedical engineering
  • Douglas Irving, associate professor of materials science and engineering
  • Audrey Jaeger, professor of educational leadership, policy and human development
  • Xiaoning Jiang, professor of mechanical and aerospace engineering
  • Steve McDonald, associate professor of sociology and anthropology
  • Anne McLaughlin, associate professor of psychology
  • Fay Cobb Payton, professor of information technology
  • Joshua Pierce, assistant professor of chemistry
  • Balaji Rao, associate professor of chemical and biomolecular engineering
  • Chris Reberg-Horton, associate professor of crop and soil sciences
  • Ana-Maria Staicu, associate professor of statistics
  • Wenqiao Yuan, associate professor of biological and agricultural engineering
Writing for an Academic Audience
By Matt Shipman | Nov. 29, 2016

Editor’s Note: This is a guest post by Natalie Ames, an associate professor and director of the BSW program in NC State’s Department of Social Work. Her practice experience includes medical social work, individual and group counseling with survivors of sexual assault and domestic violence, and program development, administration and community outreach. Ames has taught workshops on how to write clearly for national, regional and statewide audiences. This post first appeared on Oxford University Press’s OUPblog.

Meyer’s law is too often embodied in academic and professional writing: “It is a simple task to make things complex, but a complex task to make them simple.”

Completing multitudinous years of education presumably encourages people to juxtapose one esoteric word after another in order to fabricate convoluted paragraphs formulated of impressively, extensively elongated and erudite sentences. To put it another way: completing many years of education encourages people to write complex paragraphs full of long sentences composed of long words.

What we may not do is consider whether the audiences for our writing will be willing and able to read and understand what we write. In other words, aim for readability. The first step is to identify what your audience needs to know. The next step is to incorporate principles that enable you to tell your audience what they need to know clearly, simply, and concisely.

In reality, most of us are both creators and recipients of needlessly complicated prose. Take, for example, the consent forms we sign for medical procedures. Do you read them? Do you understand them? What about the lengthy, convoluted online user agreements we’re asked to accept? Have you ever read one from top to bottom, or do you scroll through and hope for the best when you check the box at the end? We tend to view such documents as complicated for legal reasons, but in actuality, entities including the American Bar Association and the federal government actively advocate using plain language.

There is widespread agreement in the health professions on the need to simplify the language used to communicate health information. There are pleas for business and industry to simplify the language they use to market products. And in academia, clear writing is certainly not the norm. As an academic, I agree with Harvard’s Steven Pinker that the transmission of knowledge would benefit greatly from trading “academese” for plain English.

Those of us who write for audiences of professionals outside our disciplines, or for non-academic audiences including the general public, may not even realize how often we use professional jargon and acronyms that are unfamiliar or indecipherable to others. As a social worker, I know that TANF is the acronym for Temporary Assistance for Needy Families, and SNAP stands for Supplemental Nutrition Assistance Program. If you’re not a social worker, you may know these programs as “welfare” and “food stamps.”

Like other professionals, social workers also use plenty of jargon including terms such as empowerment, intimate partners, at-risk youth, and strengths-based assessment. I’ll leave you to puzzle over the meaning of those terms and then to consider how often you use your profession’s acronyms and jargon in place of more commonly understood language. If you’re in the habit of doing that, there’s a good chance you are confusing, rather than informing, people who read what you write.

Last, but not least, we may feel compelled to demonstrate how much we know by presenting ideas in technical terms, using three or four words when one would do, or choosing obscure words (particularly if they contain many syllables) over more easily recognizable ones. When you want to communicate to those outside your profession or discipline, consider the potential benefits of presenting information in plain language, “…clear, straightforward expression, using only as many words as are necessary” (Eagleson, 1990). That will increase your chances of reaching readers who know less than you do and who might benefit from the information you’re trying to convey.

This is not a plea for dumbing down, that ugly term some people use to describe easy-to-read written material. It is possible to convey ideas and information in writing that is pleasurable and easy for most people to read and comprehend. Personally, I’ve never heard anyone complain about writing that was too easy to understand or wish they’d had to expend more time and effort to understand something they just read. If we don’t seek to baffle and bewilder our readers, it should be worth investing a bit of time and effort in putting our messages across clearly and concisely. Why bother otherwise?

Researchers Tweak Enzyme ‘Assembly Line’ to Improve Antibiotics
By Tracey Peake | Nov. 29, 2016

Image: Robotic workstations of an enzyme assembly line selecting building blocks to synthesize antibiotic scaffolds. Image credit: Gavin Williams.

Researchers from North Carolina State University have discovered a way to make pinpoint changes to an enzyme-driven “assembly line” that will enable scientists to improve or change the properties of existing antibiotics as well as create designer compounds. The work is the first to efficiently manipulate which building blocks the enzyme selects in the act of synthesizing erythromycin, an important antibiotic.

Many antibiotics are synthesized by huge sets of enzymes called polyketide synthases: a series of proteins arranged in a particular order that recruit specific small molecule building blocks to assemble the drug of interest. Picture an automobile assembly line – a car is assembled sequentially, using various interchangeable parts as it moves from one workstation to the next in line. Drug synthesis via polyketide synthases works in much the same way. Each protein module acts as a workstation responsible for selecting and adding another specific building block to the antibiotic.
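The assembly-line analogy can be sketched as a toy program. Everything here is hypothetical and illustrative (the class names and "natural"/"man-made" labels are inventions, not chemistry); the point it demonstrates is the one in the article: each module selects one building block in sequence, so re-engineering a single module's specificity changes only that position in the final product.

```python
class Module:
    """One workstation on the assembly line: selects one building block."""

    def __init__(self, name: str, substrate: str):
        self.name = name
        self.substrate = substrate  # the building block this module selects

    def extend(self, chain: list) -> list:
        # Add this workstation's building block to the growing product.
        return chain + [self.substrate]


def run_assembly_line(modules: list) -> list:
    """Pass the growing chain through each workstation in order."""
    chain: list = []
    for module in modules:
        chain = module.extend(chain)
    return chain


# A six-module line, loosely mirroring the erythromycin example in which
# the sixth module (Ery6) normally installs a methyl-group building block.
line = [Module(f"module{i}", "natural") for i in range(1, 7)]
print(run_assembly_line(line))  # every position uses the natural building block

# "Surgical" engineering: change only module 6's substrate specificity.
line[5].substrate = "man-made"
print(run_assembly_line(line))  # only position 6 changes in the product
```

Swapping the substrate of one module while leaving the rest of the line untouched is the software analogue of the single-amino-acid tweaks described below.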

Improving upon existing antibiotics is an efficient way to create new drugs in terms of both time and cost. If researchers could manipulate the function of each module in the enzyme assembly line, it would allow them to design man-made molecules, thus fine-tuning the pharmacological properties of a drug. However, no one had been able to discover how to make small tweaks to an enzyme module in order to completely change which building blocks are selected during the assembly process.

Gavin Williams, associate professor of bio-organic chemistry at NC State and corresponding author of a paper describing the work, along with former Ph.D. student and first author Irina Koryakina, former Ph.D. student John McArthur and current Ph.D. student Christian Kasey, set out to discover how to make an enzyme module in the erythromycin assembly line select a man-made building block.

The group looked at a protein module designated Ery6, which is the sixth module, or workstation on the assembly line, for erythromycin. They found that the module identifies and installs a methyl group building block. By genetically altering the active site of the enzyme they were able to make changes to single amino acids in that area, engineering the enzyme to reject the naturally occurring substrate, or building block, that it normally selected in favor of a man-made substrate the researchers preferred.

“We have engineered the incorporation of a non-natural substrate by changing the building block specificity of Ery6,” Kasey says. “Because we have completely changed the specificity only at the sixth module, we know that the chemistry at a defined part of the molecule will be changed. So if we move to a scenario where these compounds are generated in environments that may have both the natural and man-made substrates available to them, we know that only module six will select the substrate we want.”

“Previously, most changes to these compounds have been in the form of swapping out an entire enzyme module, rather than tweaking functionality within it,” Williams says. “Instead of a hatchet, our method is more surgical, making small but impactful changes to the module that won’t change its overall function while allowing us to fine tune the portions of the compound that we select.

“We want to apply this same approach to alter other groups in the structure so that we could diversify and modify other properties of the antibiotic. We believe that this approach will prove a powerful tool in constructing new designer compounds with pinpoint accuracy.”

The work appears in ACS Chemical Biology and was funded in part by the National Science Foundation (CHE-1151299) and the National Institutes of Health (GM104258). Koryakina is currently at Intrexon Corporation, and McArthur is currently at the University of California, Davis. Joseph Chemler, Shasha Li, Douglas Hansen and David Sherman from the University of Michigan, Ann Arbor also contributed to the work.


Note to editors: An abstract of the paper follows.

“Inversion of extender unit selectivity in the erythromycin polyketide synthase by acyltransferase domain engineering”


Authors: Irina Koryakina, NC State and Intrexon Corp.; John McArthur, NC State and UC-Davis; Christian Kasey and Gavin Williams, NC State University; Andrew Lowell, Joseph Chemler, Shasha Li, Douglas Hansen and David Sherman, University of Michigan Ann Arbor
Published: ACS Chemical Biology

Acyltransferase (AT) domains of polyketide synthases (PKSs) select extender units for incorporation into polyketides and dictate large portions of the structures of clinically relevant natural products. Accordingly, there is significant interest in engineering the substrate specificity of PKS ATs in order to site-selectively manipulate polyketide structure. However, previous attempts to engineer ATs have yielded mutant PKSs with relaxed extender unit specificity, rather than an inversion of selectivity from one substrate to another. Here, by directly screening the extender unit selectivity of mutants from active site saturation libraries of an AT from the prototypical PKS, 6-deoxyerythronolide B synthase, a set of single amino acid substitutions was discovered that dramatically impact the selectivity of the PKS with only modest reductions of product yield. One particular substitution (Tyr189Arg) inverted the selectivity of the wild-type PKS from its natural substrate towards a non-natural alkynyl-modified extender unit while maintaining more than twice the activity of the wild-type PKS with its natural substrate. The strategy and mutations described herein form a platform for combinatorial biosynthesis of site-selectively modified polyketide analogues that are modified with nonnatural and non-native chemical functionality.

New Fabrication Technique Leads to Broader Sunlight Absorption in Plastic Solar Cells
By Tracey Peake | Nov. 29, 2016

Researchers from North Carolina State University have developed a new strategy for fabricating more efficient plastic solar cells. The work has implications for developing solar cells with a wider absorption range and increased efficiency.

As plastic solar cells now rival silicon-based solar cells in power conversion efficiency, researchers want to increase the range of photonic energies that plastic solar cells absorb. Ternary solar cells, in which three materials are mixed together as a light-harvesting layer, offer a potential solution. However, while ternary solar cells have been manufactured for years, most of the devices have not met desired levels of performance – mainly due to unfavorable mixing.

Masoud Ghasemi, a graduate student in physics at NC State and lead author of a paper describing the research, worked with a team of other NC State physicists led by Harald Ade and chemists from the University of North Carolina at Chapel Hill led by Wei You to identify a way to solve the production problem.

The team proposed a calorimetric tool to study the morphology of a ternary system with two absorption-matched donor polymers and a fullerene acceptor. When fabricated by the traditional method – which involves mixing all three materials together and then depositing them onto a substrate – the system gave poor device performance.

“Using thermodynamic techniques, we were able to find that this particular mixture was undergoing ‘alloying,’ in which the donor polymers tend to group up together and push the fullerene away,” Ghasemi says. “This explains why so many conventionally produced ternary cells may have low efficiency.”

The team decided to solve the alloying problem by mixing each polymer separately with the fullerene, rather than mixing all three materials together at once. They created two distinct mixtures which were layered onto the substrate, creating sequentially cast ternary (SeCaT) solar cells, which did not fall prey to alloying.

“The SeCaT solar cells prevent the polymers from mixing due to their layered structure,” Ghasemi says. “This novel design allows fabrication of plastic solar cells with wider optical sensitivity using cheap and scalable processing steps and with reduced materials selection constraints. Hopefully this new method can be particularly useful for greenhouse applications toward zero energy farming, as the materials used to demonstrate our method have optical properties compatible to these applications.”

The work appears in Advanced Materials. Harald Ade, professor of physics at NC State, is corresponding author. NC State assistant research professor Abay Gadisa, postdoctoral scholars Long Ye, Joo-Hyun Kim and Omar Awartani, as well as UNC-CH postdoctoral scholar Liang Yan, graduate student Qianqian Zhang, and associate professor of chemistry Wei You, also contributed to the work. The research was funded by the Office of Naval Research grant N000141512322.


Note to editors: An abstract of the paper is below

“Panchromatic Sequentially-Cast Ternary Polymer Solar Cells”

DOI: 10.1002/adma.201604603

Authors: Masoud Ghasemi, Long Ye, Joo-Hyun Kim, Omar Awartani, Abay Gadisa, Harald Ade, North Carolina State University; Liang Yan, Qianqian Zhang, Wei You, University of North Carolina at Chapel Hill
Published: Advanced Materials

Abstract: A sequentially cast ternary method is developed to create stratified bulk heterojunction (BHJ) solar cells, in which the two BHJ layers are spin cast sequentially without the need to adopt a middle electrode or orthogonal solvents. This method is found to be particularly useful for polymers that form a mechanically alloyed morphology due to a high degree of miscibility in the blend.

Matt Shipman <![CDATA[For Nonprofits, Even Non-Finance ‘Capacity Grants’ Stimulate Financial Growth]]> 2016-11-29T15:01:20Z 2016-11-29T15:01:20Z Research from North Carolina State University and American University finds that so-called “capacity grants” lead to long-term financial growth for nonprofit organizations – regardless of what the grants are for. However, there is no added benefit from capacity grants that focus specifically on financial growth.

Many foundations provide nonprofit organizations with grant funding that can be used to “build capacity,” making the organizations stronger and more capable. In practice, these grants fund anything from expanding personnel training to improving financial practices.

“We wanted to know if these ‘capacity building’ grants actually improve outcomes for organizations that receive the grants,” says Amanda Stewart, an assistant professor of public administration at NC State and lead author of a paper describing the work.

“Foundations put a lot of money into these capacity grants, and usually get reports from grant recipients six or 12 months later,” Stewart says. “But we had very little information on the long-term impacts of these grants, so that’s what we chose to look at.”

To address the issue, the researchers partnered with a large foundation. The researchers looked at grant proposals, both capacity-specific and for more general programmatic needs, that were submitted to the foundation by more than 400 nonprofit organizations, spanning a 12-year period. The researchers then assessed the financial trends for each organization for three years following the year of each grant decision.

The researchers compared the financial trends of nonprofits that did not receive capacity grants to the data from nonprofits that did receive capacity grants, paying particular attention to three things: the long-term financial impact of receiving any kind of grant; the impact of receiving any capacity-building grant; and the impact of receiving a capacity-building grant that focused specifically on building financial capacity. Financial capacity grants funded actions such as hiring a grant writer or developing an in-depth fundraising plan.

“We found that 184 of the nonprofits received capacity-building grants, and that receiving any capacity grant was associated with financial growth for a nonprofit,” Stewart says. “But we also found that receiving a financial capacity grant did not boost financial growth more than receiving any other kind of capacity grant.”

Organizations that received any capacity grant grew by around 10 percent in the three years following the grant. Grants that specifically targeted financial capacity development did not lead to greater long-term financial outcomes than grants that focused on other management or governance issues.

“This tells us that capacity grants are effective investments in nonprofit organizations, but the benefits are not necessarily as targeted as one might expect,” Stewart says.

“We think that one benefit of these grants may stem from receiving the explicit imprimatur of a foundation,” Stewart adds. “In other words, receiving a capacity grant from the foundation may serve as a seal of approval that makes the nonprofit organization more attractive to other foundations and donors.”

The paper, “As you sow, so shall you reap? Evaluating if targeted capacity building improves nonprofit financial growth,” is published in the journal Nonprofit Management and Leadership. The paper was co-authored by Lewis Faulk of American University.


Note to Editors: The study abstract follows.

“As you sow, so shall you reap? Evaluating if targeted capacity building improves nonprofit financial growth”

Authors: Amanda Stewart, North Carolina State University; Lewis Faulk, American University

Published: Nov. 11, Nonprofit Management and Leadership

DOI: 10.1002/nml.21247

Abstract: Foundations’ capacity building grant programs strive to bolster performance and outcomes for their nonprofit grantees. Yet with few outcome evaluations of such programs, we have limited understanding if these capacity building efforts achieve their intended result. This study evaluates fifteen years of data for one foundation’s capacity building grant program to understand if targeted capacity building for financial management and development contributes to nonprofit financial growth. The analysis examines the management-performance link in this context and informs sector leaders who dedicate resources to capacity building programs about the outcomes of these efforts.

Matt Shipman <![CDATA[Study Finds NC Coastal Officials Eschew Climate Planning Until They See Damage]]> 2016-11-29T14:22:57Z 2016-11-29T14:22:57Z When is the best time to start planning for an emergency? Is it better to get a head start, or wait until a problem manifests? A recent study finds that local officials in coastal North Carolina are unlikely to plan for the effects of climate change until they perceive a threat to their specific communities.

“If public officials are supposed to be planning for the future, waiting until a threat is apparent really limits the amount of lead time that public agencies have to prepare,” says Brian Bulla, lead author of a paper on the study.

“We found that many public officials need to see damage before they’re willing to act,” says Bulla, an assistant professor at Appalachian State University who did the study while a Ph.D. student at NC State.

The study was designed to see what characteristics made public officials more likely to embrace the idea of “adaptive decision making” in regard to climate change. Adaptive decision making refers to efforts to put policies in place that account for the anticipated impact of climate change, such as modifying infrastructure.

For the study, researchers surveyed 88 local government officials across all 20 of North Carolina’s coastal counties. They found the variable that most closely correlated with willingness to pursue adaptive action was the extent to which an official perceived climate change as a threat to his or her community. Political ideology showed a significantly less robust correlation, and the extent to which officials professed knowledge about climate change was not correlated at all.

“The finding highlights the need to use information about the likelihood of specific climate impacts in communities when communicating with local officials,” says NC State’s Elizabeth Craig, who co-authored the paper.

The paper, “Climate change and adaptive decision making: Responses from North Carolina coastal officials,” is published online in the journal Ocean & Coastal Management.

Tracey Peake <![CDATA[Toxic ‘Marine Snow’ Can Sink Quickly, Persist at Ocean Depths]]> 2016-11-28T16:37:16Z 2016-11-28T16:33:24Z In a new study, researchers from North Carolina State University found that a specific neurotoxin can persist and accumulate in “marine snow” formed by the algae Pseudo-nitzschia, and that this marine snow can reach significant depths quickly. These findings have implications for food safety policies in areas affected by toxic marine algal blooms.

When algae cells run out of nutrients and start to die, they clump together and sink as marine snow. The algae and its marine snow aggregates can serve as a major food source for other forms of marine life like plankton-eating fish and shellfish. Pseudo-nitzschia is a microscopic algae that occurs naturally in coastal waters, and is of particular concern due to its production of the neurotoxin domoic acid. When domoic acid-containing Pseudo-nitzschia enter the food chain, humans can accidentally consume it via shellfish. This type of shellfish poisoning, known as amnesic shellfish poisoning, can cause neurological and gastrointestinal symptoms ranging from short-term memory loss to – in rare cases – death.

Astrid Schnetzer, associate professor of marine, earth and atmospheric sciences at NC State, wanted to know how domoic acid gets transported to depth via marine snow after a toxic algal bloom and how long it may persist. In a previous study she showed that marine snow can reach depths of several hundred meters within a few days, which contradicted previous theories suggesting that it might dissipate and dissolve long before reaching the ocean floor.

“Recent large toxic blooms off of the California coast and the attendant damage to local shellfish and the shellfish economy underscore the importance of understanding how long the marine snow remains toxic, how deep it can go and how long marine organisms are exposed to the toxin,” Schnetzer says. “The fact that high levels of domoic acid can be found in marine life months after a bloom demonstrates the need for deciphering the mechanisms by which domoic acid reaches the seafloor.”

Schnetzer and colleagues created their own toxic algal bloom in the lab using P. australis algae, one of the most toxic Pseudo-nitzschia species and one that blooms along the U.S. West Coast. They found that after two weeks, toxic marine snow from this algae could sink at rates of over 100 meters per day. Domoic acid did not dissipate appreciably during the sinking period, retaining up to 80 percent of its original toxicity.
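Putting those reported numbers together in a back-of-the-envelope sketch (illustrative only, not part of the study's methods): combining the reported sinking rate of roughly 100 meters per day with the ~1.2-percent-per-day particulate toxin loss rate given in the paper's abstract shows how much toxicity would survive the trip to depth, assuming simple exponential decay.

```python
# Illustrative calculation using values reported in the article, not the
# study's own model: sinking rate ~100 m/day, toxin loss ~1.2% per day.
SINK_RATE_M_PER_DAY = 100.0   # reported minimum sinking rate of toxic marine snow
LOSS_RATE_PER_DAY = 0.012     # ~1.2% of particulate domoic acid lost per day

def toxin_fraction_at_depth(depth_m):
    """Fraction of the original particulate domoic acid remaining when
    marine snow sinking at SINK_RATE_M_PER_DAY reaches depth_m,
    assuming simple exponential decay at LOSS_RATE_PER_DAY."""
    days_to_depth = depth_m / SINK_RATE_M_PER_DAY
    return (1 - LOSS_RATE_PER_DAY) ** days_to_depth

# Marine snow reaching 500 m (about five days of sinking) would still
# carry roughly 94% of its original toxin load under these assumptions.
print(round(toxin_fraction_at_depth(500), 2))
```

Under these assumed values, aggregates would lose only about 6 percent of their domoic acid on the way to 500 meters, consistent with the article's point that toxicity persists to depth.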

“This study confirms that marine snow is a major vector in terms of getting domoic acid to depth,” Schnetzer says. “Our future work will focus on the ways in which smaller organisms that feed on marine snow may be affected by the toxicity, and how that in turn can affect the larger food web.”

The research appears in Harmful Algae. The work was funded by National Science Foundation grants 1459406 and 0850425 and North Carolina Sea Grant NA10OAR1040080. Schnetzer is corresponding author. NC State’s Christopher Osburn, NC State and University of North Carolina at Chapel Hill’s Robert Lampe, UNC-Chapel Hill’s Adrian Marchetti, University of South Carolina’s Claudia Benitez-Nelson and the University of Southern California’s Avery Tatters contributed to the work.


Note to editors: An abstract of the work follows.

“Marine snow formation by the toxin-producing diatom, Pseudo-nitzschia australis”

DOI:  10.1016/j.hal.2016.11.008

Authors: Astrid Schnetzer and Chris Osburn, NC State University; Robert Lampe, NC State University and UNC-Chapel Hill; Adrian Marchetti, UNC-Chapel Hill; Claudia Benitez-Nelson, University of South Carolina; Avery Tatters, University of Southern California
Published: Harmful Algae

The formation of marine snow (MS) by the toxic diatom Pseudo-nitzschia australis was simulated using a roller table experiment. Concentrations of particulate and dissolved domoic acid (pDA and dDA) differed significantly among exponential phase and MS formation under simulated near surface conditions (16ºC, 12:12 dark:light cycle) and also differed compared to subsequent particle decomposition at 4ºC in the dark, mimicking conditions in deeper waters. Particulate DA was first detected at the onset of exponential growth, reached maximum levels associated with MS aggregates (1.21 ± 0.24 ng mL-1) and declined at an average loss rate of ~1.2% pDA day-1 during particle decomposition. Dissolved DA concentrations increased throughout the experiment and reached a maximum of ~20 ng mL-1 at the final time point on day 88. The succession by P. australis from active growth to aggregation resulted in toxic MS and based on DA loading of particles and known in situ sinking speeds, a significant amount of toxin could have easily reached the deeper ocean or seafloor. MS formation was further associated with significant dDA build up at a ratio of pDA : dDA : cumulative dDA of approximately 1:10:100. Overall, this study confirms MS functions as a major vector for toxin flux to depth, that Pseudo-nitzschia-derived aggregates should be considered ‘toxic snow’ for MS-associated organisms, and that effects of MS toxicity on interactions with aggregate-associated microbes and zooplankton consumers warrant further consideration.

Matt Shipman <![CDATA[Smart Patch Releases Blood Thinners As Needed, Prevents Thrombosis in Animal Model]]> 2016-11-29T15:27:06Z 2016-11-28T16:11:18Z An interdisciplinary team of researchers has developed a smart patch designed to monitor a patient’s blood and release blood-thinning drugs as needed to prevent the occurrence of dangerous blood clots – a condition known as thrombosis. In an animal model, the patch was shown to be more effective at preventing thrombosis than traditional methods of drug delivery. The work was done by researchers at North Carolina State University and the University of North Carolina at Chapel Hill.

Thrombosis occurs when blood clots disrupt the normal flow of blood in the body, which can cause severe health problems such as pulmonary embolism, heart attack or stroke. Current treatments often rely on the use of blood thinners, such as Heparin, which require patients to test their blood on a regular basis in order to ensure proper dosages. Too large a dose can cause problems such as spontaneous hemorrhaging, while doses that are too small may not be able to prevent a relapse of thrombosis.

“Our goal was to generate a patch that can monitor a patient’s blood and release additional drugs when necessary; effectively, a self-regulating system,” says Zhen Gu, co-corresponding author on a paper describing the work. Gu is an associate professor in the joint biomedical engineering program at NC State and UNC.

“Two years ago, I spoke with Zhen Gu about the significant clinical need for precise delivery of blood thinners,” says Caterina Gallippi, a co-corresponding author and associate professor in the joint biomedical engineering program. “We, together with Professor Yong Zhu in the mechanical engineering department at NC State, assembled a research team and invented this patch.”

The thrombin-responsive microneedle patch is made of heparin-modified hyaluronic acid. Image courtesy of Yuqi Zhang.

The patch incorporates microneedles made of a polymer that consists of hyaluronic acid (HA) and the drug Heparin. The polymer has been modified to be responsive to thrombin, an enzyme that initiates clotting in the blood.

When elevated levels of thrombin enzymes in the bloodstream come into contact with the microneedle, the enzymes break the specific amino acid chains that bind the Heparin to the HA, releasing the Heparin into the blood stream.

“The more thrombin there is in the bloodstream, the more Heparin is needed to reduce clotting,” says Yuqi Zhang, a Ph.D. student in Gu’s lab and co-lead author of the paper. “So we created a disposable patch in which the more thrombin there is in the blood stream, the more Heparin is released.”

“We will further enhance the loading amount of drug in the patch. The amount of Heparin in a patch can be tailored to a patient’s specific needs and replaced daily, or less often, as needed,” says Jicheng Yu, a Ph.D. student in Gu’s lab and the other co-lead author of the paper. “But the amount of Heparin being released into the patient at any given moment will be determined by the thrombin levels in the patient’s blood.”

The research team tested the HA-Heparin smart patch in a mouse model. In the experiments, subjects were injected with large doses of thrombin, which would result in fatal blood clotting of the lungs if left untreated.

In the first experiment, mice were either left untreated, given a shot of Heparin, or given the HA-Heparin smart patch. The mice were injected with thrombin 10 minutes later. Fifteen minutes after the thrombin injection, only the mice that received no treatment had died.

In the second experiment, the thrombin was injected six hours after treatment. Fifteen minutes after the thrombin injection, all of the mice with the HA-Heparin smart patch were fine, but around 80 percent of the mice that received the Heparin shot had died.

“We’re excited about the possibility of using a closed-loop, self-regulating smart patch to help treat a condition that affects thousands of people every year, while hopefully also driving down treatment costs,” Gu says. “This paper represents a good first step, and we’re now looking for funding to perform additional preclinical testing.”

The paper, “Thrombin-Responsive Transcutaneous Patch for Auto-Anticoagulant Regulation,” is published in the journal Advanced Materials. The paper’s co-corresponding authors are Zhen Gu and Caterina Gallippi, associate professors in the joint biomedical engineering program at NC State and UNC; and Yong Zhu, an associate professor of mechanical and aerospace engineering at NC State. The paper was co-authored by Yuqi Zhang, Jicheng Yu, Jinqiang Wang, Nicholas Hanne, Chenggen Qian, Chao Wang, Hongliang Xin and Jacqueline Cole, of the joint biomedical engineering program at NC State and UNC; and Zheng Cui of NC State.

The work was supported by the Alfred P. Sloan Foundation; NC TraCS, NIH’s Clinical and Translational Science Awards, under grant 1UL1TR001111; and the National Science Foundation through the ASSIST Engineering Research Center at NC State (EEC-1160483) and grant EFRI-1240438.


Note to Editors: Study details follow.

“Thrombin-Responsive Transcutaneous Patch for Auto-Anticoagulant Regulation”

Authors: Yuqi Zhang, Jicheng Yu, Jinqiang Wang, Nicholas J. Hanne, Chenggen Qian, Chao Wang, Hongliang Xin, Jacqueline H. Cole, Caterina M. Gallippi and Zhen Gu, North Carolina State University and the University of North Carolina at Chapel Hill; Zheng Cui and Yong Zhu, North Carolina State University

Published: Nov. 25, 2016 in Advanced Materials

DOI: 10.1002/adma.201604043

Matt Shipman <![CDATA[Cooking Stuffing This Holiday? Here’s a Simple Way to Help Ward Off Foodborne Illness]]> 2016-11-22T20:51:40Z 2016-11-22T18:44:08Z Editor’s Note: This is a guest post from Ben Chapman, a food safety researcher and holiday meal enthusiast. He has some food safety tips to help you avoid making loved ones sick this holiday season — because nothing ruins a get-together like projectile vomiting.

As a Canadian in the U.S. I’ve fully embraced the holiday season that runs from Thanksgiving through December. I enjoy spending a day planning and shopping for an event-style meal, and then another day actually preparing and cooking it. I throw on some tunes (this year it will probably be Drake, for my Canadian roots, and the Avett Brothers as a nod to North Carolina) and with the help of the rest of the family I’ll roast a turkey, make mashed potatoes, green beans, squash, beets and a couple of other harvest vegetables.

And we’ll make a lot of stuffing.

Depending on your preference and food persuasion there are lots of different stuffing or dressing options.

A common question that pops up is whether it’s better to cook stuffing in the bird to preserve moisture (and get flavored by the turkey juices) or to prepare it as a separate dish. The concern is that stuffing placed in the turkey cavity may become contaminated by turkey juices, allowing Salmonella and Campylobacter to migrate through the stuffing. It’s easier to recommend avoiding the cross-contamination altogether than to manage the risk. But what does the science say?

I’m a food safety nerd and take a science-based approach to my meals. Armed with a digital, tip-sensitive thermometer I’m happy to jam stuffing up inside of my poultry and use the probe to check the temperature. And I use 165 degrees Fahrenheit as a target for my bread-based stuffing.

There’s some history to that number; in 1958 Raymond Rogers and Millard Gunderson of the Campbell Soup Company published some work evaluating the safety of roasting frozen stuffed turkeys (a new product at the time). Using a known amount of Salmonella pullorum, nine turkeys and some then-fancy ceramic thermocouples, they found that they could get an 8-log (or 99.999999%) reduction when the deepest part of the stuffing hit 160 degrees Fahrenheit. They recommended 165 degrees to be conservative (and because some thermometers aren’t always very accurate).
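For readers unfamiliar with log-reduction notation, the conversion to a kill percentage is simple arithmetic; this short sketch (an illustration of the notation, not code from the 1958 paper) shows how an 8-log reduction becomes 99.999999 percent:

```python
def log_reduction_to_percent(n_logs):
    """Convert an n-log reduction in bacterial count to the percentage
    of organisms killed: each log is a tenfold reduction."""
    surviving_fraction = 10 ** (-n_logs)
    return (1 - surviving_fraction) * 100

# An 8-log reduction leaves 1 surviving organism per 100 million.
print(round(log_reduction_to_percent(8), 6))  # 99.999999
```

The same arithmetic explains why regulators favor log units: each additional log of reduction adds another “9” to the kill percentage.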

From the manuscript (comments that still apply today): “The initial temperature and the size of the turkey influence considerably the time required to reach a lethal temperature in the stuffing. The lower the initial temperature of the turkey, the longer the roasting period required. Present recommended roasting procedures designating hours cooking time or which stipulate a thigh or breast temperature to be attained alone does not appear to be adequate bacteriologically.”

Inside the bird, outside the bird; meat or no meat: Use a thermometer.

Note: Chapman has also made an entire video devoted to minimizing risk from foodborne illness when cooking turkey. More food safety tips from Chapman are available here.

Tracey Peake <![CDATA[Keratin and Melanosomes Preserved in 130-Million-Year-Old Bird Fossil]]> 2016-12-05T15:39:23Z 2016-11-21T20:15:54Z Eoconfuciusornis Image Credit: Dr. Xiaoli Wang

New research from North Carolina State University, the Chinese Academy of Sciences and Linyi University has found evidence of original keratin and melanosome preservation in a 130-million-year-old Eoconfuciusornis specimen. The work extends the timeframe in which original molecules may preserve, and demonstrates the ability to distinguish between ancient microstructures in fossils.

Eoconfuciusornis, crow-sized primitive birds that lived in what is now China around 130 million years ago, are the earliest birds to have a keratinous beak and no teeth, like modern birds. Previous studies argued that the feathers of these and other ancient birds and dinosaurs preserved small, round structures interpreted to be melanosomes – pigment-containing organelles that, along with other pigments, give feathers their color. However, without additional evidence, it was not possible to prove that these structures weren’t just microbes that had coated the feather during decomposition and fossilization.

Yanhong Pan, associate research fellow at the Chinese Academy of Sciences and corresponding author of a paper describing the research, and co-author Mary Schweitzer, NC State professor of biology with a joint appointment at the North Carolina Museum of Natural Sciences, examined feathers from an Eoconfuciusornis specimen taken from the Jehol Biota site in northern China, which is renowned for excellent fossil preservation.

“If these small bodies are melanosomes, they should be embedded in a keratinous matrix, since feathers contain beta-keratin,” Schweitzer says. “If we couldn’t find the keratin, then those structures could as easily be microbes, or a mix of microbes and melanosomes – in either case, predictions of dinosaur shading would not be accurate.”

Pan, Schweitzer and their team used both scanning and transmission electron microscopy to get microscopic details of the feather’s surface and its internal structure. They also utilized immunogold labeling – in which gold particles are attached to antibodies that bind to particular proteins in order to make them visible in electron microscopy – to show that filaments within the feathers were keratin.

Finally, they mapped copper and sulfur in these feathers at high resolution. Sulfur was broadly distributed, reflecting its presence in both keratin and melanin molecules in modern feathers. However, copper, which is found only in modern melanosomes and is not part of keratin, was observed only in the fossil melanosomes. These findings both support the identity of the melanosomes and indicate that there was no mixing or leaching during decomposition and fossilization.

“This study is the first to demonstrate evidence for both keratin and melanosomes, using structural, chemical and molecular methods,” Pan says. “These methods have the potential to help us understand – on the molecular level – how and why feathers evolved in these lineages.”

The work appears in Proceedings of the National Academy of Sciences. The research was supported in part by the National Science Foundation (EAR-1344198), the David and Lucile Packard Foundation, and the National Natural Science Foundation of China. NC State’s Wenxia Zheng, Elena Schroeter and Alison Moyer (now at Drexel University), the Chinese Academy of Sciences’ Zhonghe Zhou, Jingmai K. O’Connor and Min Wang, and Linyi University’s Xiaoting Zheng and Xiaoli Wang contributed to the work.


Note to editors: An abstract of the paper follows

“Molecular evidence of keratin and melanosomes in feathers of the Early Cretaceous bird Eoconfuciusornis”    

DOI:  10.1073/pnas.1617168113

Authors: Yanhong Pan, Zhonghe Zhou, Jingmai K. O’Connor and Min Wang, Chinese Academy of Sciences; Mary Schweitzer, Wenxia Zheng and Elena Schroeter, NC State University; Alison Moyer, NC State and Drexel University; Xiaoting Zheng and Xiaoli Wang, Linyi University
Published: Proceedings of the National Academy of Sciences

Microbodies associated with feathers of both non-avian dinosaurs and early birds were first identified as bacteria, but have been reinterpreted as melanosomes. While melanosomes in modern feathers are always surrounded by and embedded in keratin, the preservation of melanosomes embedded in keratin in fossils has not been previously demonstrated. Here, we provide multiple independent molecular analyses of both microbodies and the associated matrix recovered from feathers of a new specimen of the basal bird Eoconfuciusornis from the Early Cretaceous Jehol Biota of China. Our work represents the oldest ultrastructural and immunological recognition of avian beta-keratin from an Early Cretaceous (~130 Ma) bird. Furthermore, for the first time, we apply immunogold to identify protein epitopes at high resolution, by localizing antibody-antigen complexes to specific fossil ultrastructures. Retention of original keratinous proteins in the matrix surrounding electron-opaque microbodies supports their assignment as melanosomes and adds to the criteria employable to distinguish melanosomes from microbial bodies. Our work sheds new light on molecular preservation within normally labile tissues preserved in ancient fossils.

Tracey Peake <![CDATA[New Treatment for Allergic Response Targets Mast Cells]]> 2016-11-18T20:48:00Z 2016-11-21T20:15:27Z Researchers from North Carolina State University and the National Institutes of Health (NIH) have developed a method that stops allergic reactions by removing a key receptor from mast cells and basophils. Their work has implications for the treatment of skin allergies and asthma.

Allergic reactions are driven by mast cells and basophils – types of inflammatory cells found in tissues and the bloodstream, respectively, that function as part of our immune system. When you come into contact with an allergen – ragweed, for example – immunoglobulin E (IgE) specific to that allergen acts through its receptor on the mast cell, stimulating the mast cells and basophils to release mediators, such as histamine, that trigger an allergic response.

Currently, most allergy treatments focus either on stopping the effects of histamine and other mediators or on dampening the body’s overall immune response with steroids. Unfortunately, neither approach is totally effective, and dampening the immune response can have significant drawbacks.

Glenn Cruse, formerly of the NIH, current assistant professor of immunology at NC State and corresponding author of a paper describing the work, along with NIH colleague and co-author Dean Metcalfe, wanted to try and block the allergic reaction at its source.

Cruse and Metcalfe looked at a gene called MS4A2, which is only expressed in mast cells and basophils, and is responsible for forming the IgE receptor on the mast cell.

The researchers utilized a technique known as exon skipping, a form of RNA splicing, to eliminate the portion of the IgE receptor gene’s mRNA that is essential for producing the protein that places the IgE receptor on the mast cell surface. The cell’s DNA remains unaffected, and when the exon-skipping therapy is stopped, the blocked protein is made again.

Cruse and Metcalfe tested their therapy on mast cells in vitro – where it eliminated activation of mast cells by allergen – and against allergic dermatitis in vivo, using a mouse model. Their results in vivo showed a marked reduction in the allergic dermatitis response in the mice.

“Asthma and allergic diseases affect up to 20 percent of people in developed countries, and their prevalence is increasing,” says Cruse. “By eliminating the expression of the IgE receptor on the surface of mast cells, we have identified an innovative and targeted approach with the potential to treat allergic inflammation in millions of patients worldwide.

“Due to the specificity of our approach for mast cells and basophils, it should have significant advantages over current therapies. However, it is important to note that while our findings are very promising, we are still in the early stages of developing this therapeutic approach.”

The research appears in Proceedings of the National Academy of Sciences. Funding was provided by the NIH Division of Intramural Research of NIAID and NHLBI.


Note to editors: An abstract of the paper follows

“Exon skipping of FcεRIβ eliminates expression of the high affinity IgE receptor in mast cells with therapeutic potential for allergy”

DOI:  10.1073/pnas.1608520113

Authors: Glenn Cruse, North Carolina State University and National Institutes of Health; Tomoki Fukuyama, Greer K. Arthur, Wolfgang Bäumer, North Carolina State University; Yuzhi Yin, Avanti Desai, Michael A. Beaven and Dean D. Metcalfe, National Institutes of Health

Published: Proceedings of the National Academy of Sciences

Allergic diseases are driven by activation of mast cells and release of mediators in response to IgE-directed antigens. However, there are no drugs currently available that can specifically down-regulate mast cell function in vivo when chronically administered. Here, we describe an innovative approach for targeting mast cells in vitro and in vivo using antisense oligonucleotide-mediated exon skipping of the β-subunit of the high affinity IgE receptor (FcεRIβ) to eliminate surface high affinity IgE receptor (FcεRI) expression and function, rendering mast cells unresponsive to IgE-mediated activation. As FcεRIβ expression is restricted to mast cells and basophils, this approach would selectively target these cell types. Given the success of exon skipping in clinical trials to treat genetic diseases such as Duchenne muscular dystrophy, we propose that exon skipping of FcεRIβ is a potential approach for mast cell-specific treatment of allergic diseases.

Alastair Hadden <![CDATA[A Vaccine for Zika]]> 2016-11-21T18:41:22Z 2016-11-21T18:41:22Z 0 Tracey Peake <![CDATA[AAAS Fellows Named]]> 2016-12-01T15:23:13Z 2016-11-21T16:15:16Z Three North Carolina State University faculty members have been elected as fellows of the American Association for the Advancement of Science (AAAS).

John Michael Blondin, Alumni Distinguished Undergraduate Professor of Physics and associate dean for research in the College of Sciences, was elected for distinguished contributions to astrophysics and physics education, particularly in supernovae and supernova remnants.

Steven D. Clouse, professor emeritus of genomics, plant physiology and horticultural science, was elected for distinguished contributions to the field of plant biology, particularly for pioneering studies of brassinosteroid signaling and plant receptor kinases. Clouse, who retired from NC State in 2015, is currently a program director in the Division of Molecular and Cellular Biosciences at the National Science Foundation in Arlington, Virginia.

Anastasios A. Tsiatis, Gertrude M. Cox Professor of Statistics, was elected for distinguished contributions to survival analysis, statistical methods for clinical trials, causal inference and the broad fields of semiparametric models and dynamic treatment regimes.

They are among 391 scientists to be honored this year by AAAS, the world’s largest scientific society and publisher of the journal Science.

Each year, the AAAS Council – the policymaking body of the society – elects members who have shown “scientifically or socially distinguished efforts to advance science or its applications.” Fellows are nominated by their peers and undergo an extensive review process.

The NC State fellows will be recognized at the AAAS annual meeting in Boston, Massachusetts, in February.