SIGGRAPH ’19 Tackles Facial Animation, AI, Game Ethics & Real-Time Production

SIGGRAPH 2019 has detailed its Real-Time Live! and Games programming for the upcoming conference in L.A. (July 28-Aug. 1), which will shine a light on topics including facial animation, AI, ethics and the future of real-time production. In addition, “Il Divino: Michelangelo’s Sistine Ceiling in VR” will debut at the show, powered by Unreal Engine.

Real-Time Live! showcases the essential and growing role of real-time technology in revolutionizing the way we interact in an increasingly connected society. Following the theme “thrive on live,” the innovations that teams will demo are well-positioned to enhance future technological applications.

“We have a great lineup of demos prepared this year and I am so excited to share the work our jury culled from nearly 50 submissions,” said SIGGRAPH 2019 Real-Time Live! Chair Gracie Arenas Strittmatter. “I hope you’ll join us as we take a look back at the past decade of Real-Time Live! and journey into its promising future. Our contributors will share new applications for robotics, ray tracing, AI, VR, games, film, and much more.”

Taking the future of production further, Games programming at SIGGRAPH spans conference Talks, Panels, Computer Animation Festival, Appy Hour, Immersive Pavilion and more. The conference committee and reviewers work together to help spotlight creators, visionaries and pioneers within the games industry from the world’s leading studios. Content is presented through sessions and demos that spotlight the latest rendering techniques, real-time tricks, asset pipelines, facial animation, advances in hardware and software, production challenges and accomplishments, VR/AR and beyond.

SIGGRAPH 2019 Games Co-chair Christopher Evans noted, “At SIGGRAPH 2019, attendees can expect to immerse themselves in the most engaging new gaming technologies and techniques. In addition to being a key place to learn about the latest research in games, the speakers and experiences lined up this year shine a light on the ways the industry is thriving.”

Highlights include:

Real-Time, Single Camera, Digital Human Development
Contributors: Doug Roble, Darren Hendler, Jeremy Buttell, Lonnie Iannazzo, Melissa Cell, Deer Li, Jason Briggs, Chad Reddick, Mark Williams, Lucio Moser, Cydney Wong, Dimitry Kachkovski, Jason Huang, Kai Zhang, David McLean, Rickey Cloudsdale, Dan Milling, Ron Miller, JT Lawrence, and Chinyu Chien, Digital Domain

Digital Domain’s Virtual Human team will demonstrate its state-of-the-art, real-time, live-driven digital human technology, showing how photo-real digital humans with realistic facial animation can change the VR/AR experience.

“Reality vs. Illusion” Real-Time Ray Tracing
Contributors: Natalie Burke, Arisa Scott, Natalya Tatarchuk, and Sebastien Lagarde, Unity Technologies
We seamlessly transition from the filmed real-world car to the car ray traced in real time with Unity, demonstrating the advanced effects of real-time ray tracing technology in a live demo – dynamic global reflections and GI, multi-layer transparency with refraction, area lights, shadows, ambient occlusion, and more.

Il Divino: Michelangelo’s Sistine Ceiling in VR
Contributors: Christopher Evans, Epic Games
Experience Michelangelo’s Sistine Chapel ceiling in virtual reality. Move around the space or take a guided tour. Users can also perch on a virtual scaffold to see the work up close, as few have seen it before. The experience will use the all-new Valve Index headset.

FACS at 40
Panelists: Mike Seymour, University of Sydney and fxguide (moderator); Erika Rosenberg, Stanford University and Erika Rosenberg Consulting; Vladimir Mastilovic, 3Lateral; Mark Sagar, Soul Machines; and, John Peter Lewis, Google AI

In 1978, researchers began systematically coding human facial expressions. The resulting Facial Action Coding System (FACS) would become one of the cornerstones of facial animation. Forty years on, this panel discusses the strengths and limitations of FACS, its relevance in the age of AI, and what people are doing to improve it.

The Ethical and Privacy Implications of Mixed Reality
Panelists: Kent Bye, Voices of VR Podcast (moderator); Diane Hosfelt, Mozilla; Matt Miesnieks, 6D.AI; Samantha Mathews Chase, Venn.Agency; and Taylor Beck, Magic Leap
The spatial computing affordances of virtual and augmented reality introduce all sorts of new ethical and privacy dilemmas. This panel will explore best practices for navigating biometric privacy, the AR Cloud and self-sovereign identity, and for creating a framework to address the ethical and privacy implications of mixed reality.

THRIVE – Foundational Principles & Technologies for the Metaverse
Speaker: Tim Sweeney, Epic Games
Science-fiction notions of “the metaverse” are slowly becoming a reality as products such as Fortnite, Minecraft and Roblox bring immersive social experiences to hundreds of millions of people and blur the boundaries between games and social networks. This talk explores the foundational technologies, economies, and freedoms required to build this future medium as a force for good.

Access to Real-Time Live! and most Games programming requires a Select Conference badge or higher. Register for SIGGRAPH 2019, the 46th international conference and exhibition on computer graphics and interactive techniques, on the conference website.


Animation Magazine

Bones Help Black People Keep Facial Aging at Bay

By Steven Reinberg

HealthDay Reporter

TUESDAY, June 11, 2019 (HealthDay News) — Why do so many black adults continue to look youthful as they age?

A new study says it’s in their bones.

Researchers found that the facial bones of black adults retain a higher mineral content than those of other races, which makes their faces less likely to reflect their advancing years.

The new study is the first to document how facial bones change as black adults age, and may help guide plastic surgeons’ work.

“It is important for plastic surgeons to understand how the facial aging process differs among racial and ethnic groups to provide the best treatment,” said study author Dr. Boris Paskhover. He is an assistant professor at Rutgers New Jersey Medical School, in Newark.

For the study, his team looked at medical records of 20 black adults spanning 1973 to 2017. The study patients had at least two face scans taken 10 years apart.

Although all of the faces changed over time, they showed only minor changes, compared to similar studies on the aging white population.

“This finding reflects other studies that show black adults have higher bone mineral density, decreased rates of bone loss and lower rates of osteoporosis as compared to the general population,” Paskhover said in a university news release.

Facial aging results from a combination of changes to the skin, muscle, fat and bones.

As people age, declining mineral density leads to bone loss, which can affect the shape of the nose, lower jowl area, cheekbones, and middle and lower areas of the eye sockets, the researchers explained.

“As bones change, they affect the soft tissue around them, resulting in perceived decreases in facial volume,” Paskhover said. “Treatment should consider the underlying bone structure.”

The report was published online recently in JAMA Facial Plastic Surgery.

WebMD News from HealthDay


SOURCE: Rutgers University, news release, June 2019

Copyright © 2013-2018 HealthDay. All rights reserved.

More information

Harvard Medical School has more about facial aging.

Posted: June 2019 – Daily MedNews

Reallusion Launches Cartoon Animator 4, Facial Mocap Plug-in

Reallusion has launched Cartoon Animator 4 (formerly CrazyTalk Animator), the complete professional 2D character system, and its new Facial Mocap Plug-in (Motion LIVE 2D), which allows users to animate characters with their own facial expressions in real time. Over the last decade, Cartoon Animator has been widely used by everyone from million-subscriber YouTube channels and online businesses to graphic designers, marketers and award-winning directors.

“Cartoon Animator 4 is one of the most accessible 2D character design and animation tools in the current market for both entry-level and professional users,” said John C. Martin, VP of Product Marketing, Reallusion. “CTA 4 is a complete 2D character system with tools to design and animate in a new way, with a unique motion UI and the industry-breakthrough 2D 360-degree head creator. Indies, pros and first-time animators can apply speed to creativity with a new approach to 2D animation.”

Cartoon Animator 4 is a total 2D animation toolbox that can turn images into animated characters, generate lip-sync animation from audio, accomplish 3D parallax scenes, produce 2D visual effects, access content resources, and wield a comprehensive Photoshop pipeline to rapidly customize and create characters.

With Cartoon Animator’s Facial Mocap Plug-in, anyone can animate characters with their own facial performance, using a webcam or an iPhone TrueDepth camera to track expressions along with head and eye movements, plus natural body animations driven by head position. This solution is designed for virtual production, performance capture, live TV shows and streaming web broadcasts. Key features include:

  • Real-time Face Tracking via Webcam and/or iPhone: Users can employ any webcam or iPhone X to capture real-time face tracking via the Facial Mocap Plug-in; facial expressions are instantly projected onto virtual characters in Cartoon Animator.
  • Head-Driven Body Movement: During facial mocap, users can also add upper-body motions by capturing head movements, blending values, and adjusting arm or forearm rotation, all of which can be blended directly during live performances.
  • Real-time Lip Sync and Audio Recording: The Timeline editor can edit motion clips, alter speeds, and blend and refine captured phoneme expressions. Turning on the PC microphone enables simultaneous audio recording for complete control over talking lip shapes.
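One small, generic piece of any real-time tracking pipeline like the one described above is smoothing the jittery landmark positions a webcam produces before they drive a character. The sketch below uses a plain exponential moving average; it illustrates the general technique, not Reallusion’s actual algorithm, and the function name and data layout are invented.

```python
# Hypothetical sketch of one step in a real-time face-tracking pipeline:
# raw landmark positions from a webcam are noisy, so trackers typically
# smooth them before driving a character rig. This is a plain
# exponential moving average, not Reallusion's implementation.

def smooth_landmarks(frames, alpha=0.5):
    """Exponentially smooth a sequence of landmark coordinate tuples."""
    smoothed = []
    state = None
    for frame in frames:
        if state is None:
            state = list(frame)          # first frame passes through
        else:
            state = [alpha * new + (1 - alpha) * old
                     for new, old in zip(frame, state)]
        smoothed.append(tuple(state))
    return smoothed

# Jittery x-coordinate of a single landmark settles toward its mean:
print(smooth_landmarks([(10.0,), (11.0,), (9.0,), (10.5,)]))
# [(10.0,), (10.5,), (9.75,), (10.125,)]
```

A lower `alpha` gives heavier smoothing at the cost of more perceived latency, which is the central trade-off in any live mocap filter.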

The 360 Head Creator streamlines the workflow for head creation and expression setup, while directly applying it to the animation core through face key editing, puppeteering and the timeline system. It transforms static 2D art into 3D-styled characters with up to 360 degrees of motion for deeply rich performances. Artists can also use the Photoshop round trip in/out integration for editing multi-angle characters.

The Smart IK (Inverse Kinematics) Animation system’s simple and functional design sets Cartoon Animator apart from other 2D applications, as the intuitive IK/FK system auto-switches for a fluid and logical workflow. Smart Motion Retargeting correctly applies any motion file to different body shapes, while automatically aligning new characters to the original motion pose.
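For readers curious what an IK solver actually computes: given two bone lengths and a target for the end of the limb, it recovers the joint angles. Below is the textbook two-bone solution in 2D via the law of cosines; it is a generic sketch, not Cartoon Animator’s implementation.

```python
import math

# Generic two-bone IK via the law of cosines: given upper/lower limb
# lengths (l1, l2) and a 2D target (tx, ty), return the shoulder angle
# and the interior elbow angle. Textbook approach, not CTA 4's code.

def two_bone_ik(l1, l2, tx, ty):
    d = math.hypot(tx, ty)
    d = min(max(d, 1e-9), l1 + l2 - 1e-9)    # clamp degenerate/unreachable targets
    # Interior elbow angle from the law of cosines.
    cos_elbow = (l1**2 + l2**2 - d**2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder: direction to target minus the offset toward the elbow.
    cos_off = (l1**2 + d**2 - l2**2) / (2 * l1 * d)
    shoulder = math.atan2(ty, tx) - math.acos(max(-1.0, min(1.0, cos_off)))
    return shoulder, elbow

def forward(l1, l2, shoulder, elbow):
    """Forward kinematics to verify the solve (elbow angle is interior)."""
    ex = l1 * math.cos(shoulder)
    ey = l1 * math.sin(shoulder)
    a2 = shoulder + (math.pi - elbow)
    return ex + l2 * math.cos(a2), ey + l2 * math.sin(a2)

# Solve for a target and verify by running the pose forward again:
s, e = two_bone_ik(1.0, 1.0, 1.0, 1.0)
print(forward(1.0, 1.0, s, e))   # ≈ (1.0, 1.0)
```

The “smart” part of a production system is layered on top of a solve like this: deciding when to switch between IK and FK and how to blend the two, which the code above deliberately leaves out.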

Cartoon Animator 4 is available for $199 (Pipeline edition) or $99 (PRO edition). The PC/Mac webcam facial tracker is a separate module that comes with the Motion LIVE 2D plug-in ($199). More information is available on Reallusion’s website.


‘Vampire Facial’ Spa Clients to be Tested for HIV

Sept. 14, 2018 — Customers of an Albuquerque spa who received a “vampire facial” are being warned that they may have been exposed to HIV, hepatitis B and hepatitis C through human blood used in the controversial procedure.

In a vampire facial — popularized by Kim Kardashian and other celebrities — a patient’s blood is drawn and then injected back into the face using micro-needles.

But the New Mexico Department of Health says an inspection at the VIP Spa in Albuquerque detected practices that might spread HIV or hepatitis B and C, CBS News reported.

“We undertook the inspection because a client of the VIP Spa developed an infection that may have resulted from a procedure performed at the spa,” state epidemiologist Dr. Michael Landen told CBS.

His agency is urging that anyone who received a vampire facial at the spa get tested as soon as possible, especially if the procedure happened in May or June. Testing would be free and confidential, they added.


WebMD Health

Facial Cancer Decimating Tasmanian Devil Population

TUESDAY, Feb. 20, 2018 — A contagious cancer has put Tasmanian devils at risk for extinction in the wild, researchers say.

The contagious cancer, known as facial tumor disease, is found only in Tasmanian devils — notoriously cantankerous animals now found in the wild only on the island of Tasmania. The cancer is spread through biting, which is common among devils during mating and feeding.

It kills infected animals within 6 to 12 months. There’s no known vaccine or cure.

Since the disease emerged, devil populations in the wild have declined by about 80 percent, according to the researchers.

“We are now dealing with very small and potentially isolated groups of devils across Tasmania,” lead researcher Billie Lazenby, a wildlife biologist with the Save the Tasmanian Devil Program, said in a news release from San Diego Zoo Global.

“The ongoing impact of [the disease], which continues to cause high mortality in devils, could make them vulnerable to other threats,” she said.

The researchers also found that remaining wild populations of Tasmanian devils are showing some reproductive changes, possibly in response to the disease.

“Devils in diseased areas are now breeding younger and having more pouch young, which has allowed them to persist at low levels in the wild,” Mathias Tobler, a population sustainability scientist with San Diego Zoo Global, said in the news release. Tasmanian devils are marsupials — animals that carry their young in pouches.

“This research has shown the structure of the wild devil populations in diseased areas has shifted dramatically, with devils over the age of 2 being very rare, compared to sites before [the disease] emerged,” Tobler said. “Earlier breeding in young devils means that they are contracting [the disease] younger, often as 1-year-olds.”

Though breeding at an earlier age has enabled wild populations of devils to continue, their low numbers make extinction more likely.

“Such large reductions in their numbers and the change in their age structure means their populations are impacted more by other threats,” David Pemberton, program manager for the Save the Tasmanian Devil Program, said in the news release. Those threats, he said, include “roadkill, bushfire, loss of genetic diversity, variation in food availability caused by drought and changes in the ecosystem as it responds to the loss of devils in the wild.”

“Efforts to manage the devils, such as the development of an immunotherapy, are ongoing, but remain in a research-and-development phase,” he said.

The research was published online recently in the Journal of Applied Ecology.

More information

The Tasmania Parks & Wildlife Service has more on the Tasmanian devil.

© 2018 HealthDay. All rights reserved.

Posted: February 2018 – Daily MedNews

SIGGRAPH: Facial Animation Advances, Fabric Engine & the French Contingent


Adobe Research has built on its Project StyLit experiments to develop a new project called “Stylized Facial Animation,” a tool for automating rotoscoping that can apply any selected artistic style, from a pastel sketch to a bronze statue, onto video footage of actors’ faces. Adobe will hold a public demo of the technology on Thursday, August 3 at SIGGRAPH, or you can read up on the project in Digital Trends’ interview with Senior Principal Scientist David Simons.

Following its presentation at Tuesday’s “Open Source at Pixar: USD and OpenSubdiv” Exhibitor Session, Animal Logic has launched its USDMaya plugin (known as “AL_USDMaya”) as an open source tool. The plugin enables powerful and efficient authoring and editing of 3D graphics data using Pixar’s open source Universal Scene Description (USD) in Autodesk’s Maya. The aim is to leverage the strengths of both Maya and USD, and use each one to represent the data it’s most suited to.

Animal Logic uses AL_USDMaya to allow native Maya entities such as complex Maya rigs loaded as references, and other assets not easily represented in USD, to be embedded in USD scenes and imported into Maya in their native form. The plugin maintains a “live” connection between the USD and Maya scenes and can respond to various events, keeping the two in sync. This affords a dynamic user experience that allows artists to swap different representations of objects in and out of their scene, including exchanging rigs for geometry caches and switching levels of detail. Additionally, heavyweight scene elements such as sets and crowds can be represented in OpenGL in the Maya viewport and manipulated with either USD or Maya tools and UI components. The source code is available now.

Fabric Software introduced Fabric Engine 2.6, with key advancements for VFX, games and VR studios. The update is available in August for existing customers.

  • Asset Patterns offer a rules-based way to control data flow to and from many different industry formats, with support for change propagation – so modeling and other asset modifications made in an authoring application can be quickly pushed to game engines like Unity and Unreal without a full import process.

  • Manipulation Framework lets technical directors and developers tailor the user experience of Fabric to create applications and animation rigs that work anywhere in a production pipeline, without being limited to what is available in DCC applications like Maya.

  • Rigging Enhancements, including support for Autodesk’s newly open-sourced AnimX curve library, and a new RBF (Radial Basis Function) solver ideal for pose-based deformations that need to be solved in real time, such as facial animation or secondary animation of cloth over a muscle system.

  • Universal Scene Description Support, tapping into Pixar’s open-source USD library for handling vast, complex scenes. “We believe that USD could be a key technology for real-time applications used for visualization, VR and AR,” says Fabric Software CEO Paul Doyle. “There is no industry standard yet for how we store and work interactively with data at scale. Customers are telling us that USD might be it. With our new Asset Patterns functionality, we see Fabric Engine as an enabling technology for USD that could accelerate adoption and help developers standardize their pipelines.”
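As a rough sketch of what an RBF solver like the one mentioned above does: given a handful of example poses and the deformation value an artist sculpted for each, it interpolates a value for any new pose. This toy version uses a Gaussian kernel and invented names; it is not Fabric Engine code.

```python
import numpy as np

# Toy RBF solver for pose-based deformation: fit weights so that a
# radial-basis surface passes exactly through the example poses, then
# evaluate it at a query pose. Gaussian kernel; illustrative only.

def rbf_fit(poses, values, sigma=1.0):
    """Solve for per-example weights (interpolation passes through examples)."""
    d = np.linalg.norm(poses[:, None, :] - poses[None, :, :], axis=-1)
    phi = np.exp(-(d / sigma) ** 2)
    return np.linalg.solve(phi, values)

def rbf_eval(poses, weights, query, sigma=1.0):
    """Interpolated deformation value at a new pose."""
    d = np.linalg.norm(query - poses, axis=-1)
    return np.exp(-(d / sigma) ** 2) @ weights

# Two example poses of a 1-DOF joint with sculpted corrective values:
poses = np.array([[0.0], [1.0]])
values = np.array([0.0, 1.0])
w = rbf_fit(poses, values)
print(rbf_eval(poses, w, np.array([0.5])))  # blends smoothly between the examples
```

In production, the “pose” vector would be joint rotations and the “value” a full set of corrective shape weights, but the solve is the same linear system.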

NVIDIA and game developer Remedy (Alan Wake, Quantum Break) showcased their team-up solution to streamlining motion capture and animation using a deep learning neural network, running on NVIDIA’s powerful DGX-1 server. After being “trained” with information on previously produced animations, the network is able to generate sophisticated 3D facial animation from videos of live actors, greatly alleviating the time and labor burden of traditional mo-cap animation — it can even learn enough to generate facial animation from just an audio clip. The companies believe this system could eventually produce animation that’s just as good or better than traditionally produced fare.

“Based on the Nvidia Research work we’ve seen in AI-driven facial animation, we’re convinced AI will revolutionise content creation,” said Antti Herva, lead character technical artist at Remedy. “Complex facial animation for digital doubles like that in Quantum Break can take several man-years to create. After working with Nvidia to build video- and audio-driven deep neural networks for facial animation, we can reduce that time by 80% in large scale projects and free our artists to focus on other tasks.”
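The “train on prior animation, then predict rig controls from new input” idea described above can be illustrated with something far simpler than NVIDIA and Remedy’s deep network. In the sketch below, a single linear least-squares fit stands in for the deep model, and the audio-feature and control dimensions are made up.

```python
import numpy as np

# Conceptual sketch of audio-driven facial animation: learn a mapping
# from per-frame audio features to facial rig controls using previously
# produced animation, then predict controls for new audio. A linear
# least-squares fit stands in for the deep network; shapes are invented.

rng = np.random.default_rng(0)

# "Previously produced animation": 200 frames of 13-dim audio features
# paired with 5 facial control values, generated by a hidden linear rule.
true_map = rng.normal(size=(13, 5))
audio_feats = rng.normal(size=(200, 13))
controls = audio_feats @ true_map

# "Training": fit the mapping by least squares.
learned_map, *_ = np.linalg.lstsq(audio_feats, controls, rcond=None)

# "Inference": drive the face rig from unseen audio.
new_audio = rng.normal(size=(10, 13))
predicted = new_audio @ learned_map
print(np.max(np.abs(predicted - new_audio @ true_map)))  # ≈ 0
```

The real system replaces the linear map with a deep network so it can capture the highly nonlinear relationship between speech and facial motion, but the train-then-predict structure is the same.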

Attendees should seek out the Pavillon France to meet the companies selected by Cap Digital and Imaginove to represent the finest in French tech and innovation. The 16 exhibitors in the 24-company contingent are seeking to connect with major players from around the world, with the support of the CNC:

  • Eisko – specializing in 3D celebrity representations for VFX and interactive
  • 4DViews – dynamic volumetric capture system
  • Dynamixyz – facial performance capture software
  • SPARTE – solutions for VR, 3D printing and more
  • Stimergy – green energy render farm rental
  • Ranch Computing – render farm and 3D modeling/calculation services
  • Qarnot Computing – fully-featured green render farm as a cloud of home heaters
  • Persistant Studios – PopcornFX real-time 3D particle effects middleware
  • Mercenaries Engineering – Guerilla Render, 3D software for animation and VFX
  • Golaem – crowd simulation, layout and previz tools
  • Texels – Kurtis pipeline creation and automation software for animation and VFX
  • Mikros Image – post-production for animation, feature films and commercials
  • TVPaint Animation – bitmap-based 2D software
  • Composite Films – film restoration and colorization
  • Kolor (GoPro division) – software solutions for immersive experiences
  • Speedernet-Sphere – software for 360 and VR experiences for the web



You Can Probably Credit Your Genes for These Facial Features

FRIDAY, April 21, 2017 — How likely is your baby to have Mom’s button nose or Dad’s chiseled cheeks?

New research suggests the odds are good.

Based on a study of nearly 1,000 twin girls in the United Kingdom, researchers say they’ve pinpointed the facial features most influenced by a person’s genes.

The finding: Genetics play a big part in the shapes of the end of the nose, the area above and below the lips, cheekbones and the inner corner of the eyes.

The King’s College London study was published April 19 in the journal Scientific Reports.

“The notion that our genes control our face is self-evident,” lead researcher Giovanni Montana said in a college news release. “However, quantifying precisely which parts of the face are strongly heritable has been challenging so far.”

Using 3D models, his team created what Montana described as “face heritability maps.”

“These maps will help identify specific genes shaping up the human face, which may also be involved in diseases altering the face morphology,” he explained. Montana is a professor in the department of biomedical engineering.

Tim Spector, director of the TwinsUK study at the college, said the research explains why we sometimes see identical twins who don’t look exactly alike.

It “shows us that even identical twins can vary quite a lot on facial features, but because of the key areas being genetically controlled, we perceive them as being ‘identical,’ ” he said in the news release.
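The twin comparison at the heart of studies like this one has a classic quantitative form: Falconer’s formula estimates heritability from how much more strongly a trait correlates in identical (MZ) than in fraternal (DZ) twin pairs. The King’s College team used more sophisticated 3D-model-based methods; the correlations below are invented for illustration.

```python
# Textbook illustration of how twin designs estimate heritability:
# Falconer's formula h^2 = 2 * (r_MZ - r_DZ), where r_MZ and r_DZ are
# trait correlations in identical and fraternal twin pairs. The
# correlations used here are made up.

def falconer_h2(r_mz, r_dz):
    """Heritability estimate, clamped to the valid range [0, 1]."""
    return max(0.0, min(1.0, 2.0 * (r_mz - r_dz)))

# Hypothetical correlations for nose-tip shape similarity:
print(falconer_h2(r_mz=0.80, r_dz=0.45))  # ≈ 0.7
```

Intuitively, MZ twins share all their genes and DZ twins about half, so doubling the correlation gap attributes the excess MZ similarity to genetics.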

More information

Learn more about genetics at the American Society of Human Genetics.

Posted: April 2017 – Daily MedNews

FDA Warns of Complications From Facial Fillers

FRIDAY, May 29, 2015 (HealthDay News) — Soft tissue fillers used in cosmetic procedures can accidentally be injected into blood vessels in the face and cause serious harm, the U.S. Food and Drug Administration warns.

The fillers are approved to treat wrinkles or to enhance cheeks or lips.

Injection of facial fillers into blood vessels can cause blockages that restrict blood supply to tissues. Filler material injected into blood vessels can also travel to other areas and cause stroke, vision problems, blindness and damage and/or death of the skin and underlying facial structures, the agency said in a news release.

Accidental injections of facial filler into blood vessels can occur anywhere on the face. But an FDA analysis of studies and reported problems found it was most likely to occur between the eyebrows and nose, in and around the nose, on the forehead, and around the eyes, the agency said.

The FDA has told makers of facial fillers to update their labeling to include additional warnings about the risk of accidental injection into blood vessels.

Before going ahead with soft tissue filler injections, patients should talk with their doctor about appropriate treatment injection sites and the risks associated with the procedure, and read the product’s labeling, the FDA said.

It’s also important to ask about the doctor’s training and experience injecting soft tissue fillers in the face, the agency said.

Patients should seek immediate medical attention if they experience any of the following during or shortly after the procedure: unusual pain; vision changes; a white patch of skin near the injection site; or signs of stroke, such as sudden difficulty speaking, numbness or weakness in the face, arms or legs, difficulty walking, face drooping, severe headache, dizziness or confusion.

Doctors should inject soft tissue fillers only if they have appropriate training and experience, and should be familiar with each patient’s blood vessel anatomy, which can vary between people, the FDA said.

Doctors also need to fully inform patients about the risks associated with the procedure, know the signs and symptoms of accidental injection of facial filler into blood vessels, and have a plan for treating patients if this occurs.


Call of Duty: Advanced Warfare Uses Version of “Avatar 2” Facial Animation System

This fall’s Call of Duty: Advanced Warfare uses a facial animation system powered by the same technology and techniques that will be used to produce James Cameron’s upcoming movie Avatar 2. That’s according to a teaser for Edge’s story about Advanced Warfare.

Publisher Activision previously said that Advanced Warfare will deliver a “near photorealistic” world. We’ve already seen some of the game’s facial animations, as a story trailer for Advanced Warfare got up close with actor Kevin Spacey’s impressive-looking character, Jonathan Irons.

Cameron’s original Avatar hit theaters in 2009. Avatar 2 is expected to be released in December 2016.

According to Activision, the new three-year Call of Duty development cycle meant that Advanced Warfare developer Sledgehammer Games was able to create a “near photorealistic world unlike any Call of Duty before.” At GDC 2013, Activision R&D staffers showed off some seriously staggering next-gen character models, though they cautioned that the models do not represent actual game assets.

Advanced Warfare launches November 4 for Xbox 360, Xbox One, PlayStation 3, PlayStation 4, and PC. Sledgehammer Games is working on the Xbox One, PS4, and PC versions, while a studio to be named later is handling the Xbox 360 and PS3 iterations. For more on Advanced Warfare, check out GameSpot’s previous coverage.

Gamespot’s Site Mashup

Baseballs, Softballs Often to Blame for Kids’ Facial Fractures

SATURDAY, July 6, 2013 — Most sports-related facial fractures among children occur when they’re trying to catch a baseball or softball, according to new research. These injuries are relatively common, and they can be serious.

The new study examined how and when facial fractures occur in various sports. “These data may allow targeted or sport-specific craniofacial fracture injury prevention strategies,” wrote study leader Dr. Lorelei Grunwaldt and colleagues at the Children’s Hospital of Pittsburgh.

The study, published in the June issue of the journal Plastic and Reconstructive Surgery, involved 167 children and teens treated in the emergency room for a sports-related fracture between 2000 and 2005. About 11 percent of facial fractures were sports-related, according to a journal news release.

Of the children treated for facial fractures, roughly 80 percent were boys. Nearly two-thirds were between 12 and 15 years old. Forty percent of injuries were broken noses, 34 percent were fractures around the eye and 31 percent were skull fractures.

Forty-five percent of the children were admitted to the hospital for their injuries. Of these, 15 percent were sent to the intensive-care unit. Roughly 10 percent of the children lost consciousness and 4 percent had more significant injuries considered a level-one trauma (the worst trauma), including an unstable airway or spinal cord injury.

Many of the children were injured by a ball while they were attempting to catch it, and 44 percent of the fractures involved baseball and softball. Most often, players were injured while attempting to field a line drive.

Meanwhile, only 10 percent of the cases occurred while playing basketball and football. Fractures sustained during these two sports, as well as most in soccer, occurred when players collided. Collisions with another player accounted for 24.5 percent of injuries, the study showed.

Falls also were to blame for about 19 percent of injuries. Other patterns of fractures were seen in specific sports. For instance, golf injuries most often happened at home when a club struck a child.

All facial fractures from skiing and snowboarding and most from skateboarding occurred in children who were not wearing helmets. Most horseback riding fractures were the result of being kicked by a horse. Although horseback riding and skateboarding injuries occurred less frequently, they were more serious. Of these, 29 percent of horseback riding injuries and 14 percent of skateboarding injuries were classified as a level-one trauma.

The study authors said their findings could help prevent future injuries in children. For instance, skateboarders, skiers and snowboarders should always wear helmets to reduce their risk for fracture. Nasal protectors can also prevent some fractures for basketball and soccer players. Softer, low-impact balls also have been recommended for youth baseball and softball. The researchers added that more protective equipment may be beneficial for outfielders playing baseball or softball.

“Our strongest recommendation for injury prevention may be further consideration of face protective equipment for players fielding in baseball and softball,” the researchers wrote.

More information

The American Association of Oral and Maxillofacial Surgeons has more about treating and preventing facial injuries.

Posted: July 2013

– Daily MedNews

People With Borderline Personality Disorder May Misinterpret Facial Emotions

THURSDAY May 23, 2013 — Symptoms of borderline personality disorder often mimic traits of other psychiatric disorders, complicating diagnosis and treatment. But researchers in Canada say they have identified a characteristic that may be unique to borderline personality disorder: a tendency to misinterpret emotions expressed by the face.

“They have difficulty processing facial emotions and will see a negative emotion on a neutral face,” said Anthony Ruocco, a clinical neuropsychologist and assistant professor at the University of Toronto. “This is not seen in bipolar disorder or schizophrenia.”

Inaccuracies in recognizing anger, sadness, fear and disgust also were noted in Ruocco’s recent study, with greater deficits related to anger and disgust.

Although more research is needed to understand the brain mechanisms involved in these misperceptions and their significance, Ruocco called these “potentially important” deficits.

“There may be neurobiological factors that contribute to these biases in emotion perception,” he said. Pinpointing those factors might lead to better understanding of the illness and improved treatments.

But whether these misperceptions trigger the outbursts so common among people with borderline personality disorder was not within the scope of the studies.

People with borderline personality disorder have trouble regulating their emotions. They tend to act impulsively, lash out in anger and have stormy relationships. Many fear abandonment, complain of feeling empty and engage in bodily self-harm, such as cutting. High rates of suicide also are associated with the condition.

Ruocco discussed the study, published in the journal Psychological Medicine, at recent conferences co-sponsored by the National Education Alliance for Borderline Personality Disorder in Boston and New Haven, Conn.

His research, called a meta-analysis, involved a review of 10 previously published studies on emotion recognition. More than 500 people were involved: 266 with borderline personality disorder and 255 mentally healthy people, or controls. Participants were overwhelmingly female, with an average age of 29.

Not only did subjects with borderline personality disorder misread facial emotions, the studies showed, but they also took more time to interpret facial emotions than others. And when they perceived anger, it induced stronger reactions than in healthy control subjects, Ruocco’s team found.

Two of the studies analyzed noted that patients were more likely to report negative emotions when viewing faces displaying no emotion.

“Neutral faces are simply faces that were produced by actors with the intention of showing no emotion,” Ruocco said. “Typically, researchers have tested these faces with healthy individuals and confirmed that nearly all people perceive them as neutral.”

A smaller study conducted by Ruocco’s team, which is not yet published, found that individuals with borderline personality disorder also reported mild expressions of sadness as more intensely sad than others do. This was true even when the researchers accounted for depression.

About 6 percent of adults in the United States have borderline personality disorder, according to the National Education Alliance for Borderline Personality Disorder, which also estimates that 20 percent of all psychiatric hospital admissions and 10 percent of outpatient treatment involve the condition.

More information

The National Education Alliance for Borderline Personality Disorder has more about borderline personality disorder.

Posted: May 2013


Gene Mutation Linked to Facial, Skull Abnormalities

THURSDAY July 12, 2012 — A gene mutation linked to facial and skull abnormalities, as well as impairment of thinking skills, has been identified by scientists who confirmed their findings in zebrafish.

Although the discovery may not lead to a cure for people with the mutated gene, it could someday lead to genetic screening and possibly early interventions, suggested the researchers from the Medical College of Georgia at Georgia Health Sciences University in Augusta.

In conducting the study, the scientists examined patients with a rare disorder known as Potocki-Shaffer syndrome, which can result in a small head and chin and intellectual disability. The investigators found that the patients had a mutated PHF21A gene. They were able to isolate the gene using a distinct chromosomal break found in patients with the condition.

“We call this breakpoint mapping, and the breakpoint is where the trouble is,” study co-author Dr. Lawrence Layman, chief of the college’s section of reproductive endocrinology, infertility and genetics, explained in a university news release.

The investigators pointed out that damaged genes may not function normally, and, in this case, PHF21A’s functioning was reduced by about half. To confirm the role this gene played in the abnormalities seen in head formation and brain skills, they suppressed the gene in zebrafish. The study revealed the fish developed similar head and brain defects.

“With less PHF21A, brain cells died, so this gene must play a big role in neuron survival,” said the study’s lead author, Dr. Hyung-Goo Kim, a molecular geneticist at the university.

To verify their findings, the researchers unsuppressed the gene in the zebrafish; because zebrafish have some capacity to regenerate, the fish returned to normal. It is important to note that results from animal research often do not translate directly to humans, and people could not be treated the way the zebrafish were.

The researchers said, however, that the study provides insight into face, skull and brain formation. The findings could lead to the development of genetic screening or intervention in the womb to increase PHF21A levels.

More research is needed to identify other genes regulated by PHF21A and screen patients for additional mutations, the study authors added.

“We want to find other people with different genes causing the same problem,” Layman said. “Now that we know the causative gene, we can sequence the gene in more patients and see if they have a mutation.”

The study was published this month in the American Journal of Human Genetics.

More information

The U.S. Office of Rare Diseases Research has more about Potocki-Shaffer syndrome.

Posted: July 2012


Most Adults With Facial Disfigurement Adapt Psychologically

TUESDAY Jan. 3, 2012 — Adults who were born with a severe facial disfigurement have generally good psychological adjustment, according to a small new study.

Dutch researchers gave a set of psychological, physical and demographic questionnaires to 59 adults, average age 34, who were born with severe facial disfigurement caused by rare, extensive facial cleft syndromes.

The same questionnaires were also completed by 59 adults with facial disfigurement caused by traumatic injury and 120 adults with no disfigurement.

The researchers found that those born with facial disfigurement had “relatively normal” psychological functioning but they did tend to have a higher rate of problems such as anxiety and depression than those with no disfigurement.

However, adults born with facial disfigurement and those with trauma-related facial disfigurement were no more likely to have a clinical level of depression and anxiety than those without facial disfigurement.

Perhaps not surprisingly, those born with facial disfigurement had lower rates of physical problems than those with trauma-related facial disfigurement, the researchers said.

Among people born with severe facial disfigurement, problems with psychological functioning were more common among those with low self-esteem and those who were concerned about how others would judge their appearance.

That finding is an important consideration for plastic and reconstructive surgeons, the researchers said.

“Improving satisfaction with facial appearance (by surgery), enhancing self-esteem or lowering fear of negative appearance evaluation (by psychological support) may enhance long-term psychological functioning,” concluded Dr. Sarah Versnel of Erasmus University Medical Center, Rotterdam, and colleagues in a journal news release.

The study appears in the January issue of the journal Plastic and Reconstructive Surgery.

More information

The Children’s Craniofacial Association has more about craniofacial syndromes.

Posted: January 2012


Bipolar Kids May Focus on Different Facial Features

THURSDAY Nov. 17, 2011 — Children with bipolar disorder and a similar condition called severe mood dysregulation spend less time looking at the eyes when trying to identify facial features, compared to children without the psychiatric disorders, researchers say.

This new study finding may help explain why children with bipolar disorder and severe mood dysregulation have difficulty determining other people’s emotional expressions, said the U.S. National Institute of Mental Health investigators.

The researchers tracked the eye movements of children with and without psychiatric disorders as they viewed faces with different emotional expressions, such as happy, sad, fearful and angry. In general, the children spent more time looking at the eyes, the facial feature that conveys the most information about emotion.

However, children with bipolar disorder and severe mood dysregulation paid less attention to the eyes and more attention to the noses and mouths of the faces.

The study was presented this week at the annual meeting of the Society for Neuroscience in San Diego.

“In combination with other studies, our findings indicate the potential value of treatment programs that teach children how to identify emotions by looking at others’ eyes,” study author Pilyoung Kim said in a society news release.

“If such training helps children to process the emotional information in their world more accurately, that may in turn increase their ability to regulate their emotional reactions to social situations,” Kim added.

Research presented at meetings should be considered preliminary until published in a peer-reviewed medical journal.

More information

The U.S. National Institute of Mental Health has more about bipolar disorder in children and teens.

Posted: November 2011
