Tuesday, October 27, 2009

Heart Attack Risk Reduction: The Low-Hanging Fruit

Dr. Yongsoon Park and colleagues recently published a great article in the British Journal of Nutrition titled "Erythrocyte fatty acid profiles can predict acute non-fatal myocardial infarction". Stated simply, the title says that the fat in your red blood cell membranes, which reflects dietary fat composition, can predict your likelihood of having a heart attack*, and can do so more accurately than standard measures of heart attack risk such as blood cholesterol.

Let's cut to the data. The investigators examined the fat composition of red blood cells in people who had suffered a heart attack, versus an equal number who had not. Participants who had heart attacks had less omega-3, more long-chain omega-6, and in particular more trans fat in their red blood cells. In fact, 96% of the heart attack patients had elevated trans fat levels, compared to 34% of those without heart attacks. This is consistent with a number of other studies showing a strong association between blood levels of trans fat and heart attack risk (ref).

92% of heart attack patients were in the lowest category of EPA in their red blood cells, as opposed to 32% of those without heart attacks. EPA is an omega-3 fat that comes from fish, and is also made by the body if there's enough omega-3 alpha-linolenic acid (think flax and greens) around and not too much linoleic acid (industrial vegetable oil) to inhibit its production. 96% of heart attack patients were in the lowest category for alpha-linolenic acid, compared to 34% of the comparison group. 0% of the heart attack patients were in the highest category for alpha-linolenic acid.

62% of heart attack patients were in the highest category of arachidonic acid (AA), compared to 34% of the comparison group. AA is made from linoleic acid, and is also found in animal foods such as eggs and liver. Animal foods from pasture-raised animals are lower in AA than their conventionally-raised counterparts, and also contain more omega-3 fats to balance it.
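To get a feel for what those percentages mean, here's a quick back-of-the-envelope sketch converting them into crude odds ratios. It assumes equal-sized case and control groups, as described above, and ignores the adjustments the investigators made, so treat the numbers as rough illustrations rather than the study's actual estimates:

```python
# Back-of-the-envelope odds ratios from the percentages quoted above.
# Assumes equal-sized case and control groups (as in the study design) and
# no adjustment for other risk factors; for illustration only.

def odds_ratio(pct_cases, pct_controls):
    """Unadjusted odds ratio given the % 'exposed' in each group."""
    odds_cases = pct_cases / (100 - pct_cases)
    odds_controls = pct_controls / (100 - pct_controls)
    return odds_cases / odds_controls

exposures = {
    "elevated trans fat":   (96, 34),
    "lowest EPA category":  (92, 32),
    "lowest ALA category":  (96, 34),
    "highest AA category":  (62, 34),
}

for name, (cases, controls) in exposures.items():
    print(f"{name}: unadjusted OR ~ {odds_ratio(cases, controls):.1f}")
```

Even as rough figures, these illustrate how sharply the red blood cell fatty acid profiles separated the two groups.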

The investigators found that low omega-3, high AA and high trans fat levels in red blood cells predicted heart attack status far better than the Framingham risk score, a traditional and widely-used measure that incorporates age, sex, smoking status, total cholesterol, HDL, hypertension and diabetes.
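For readers who haven't encountered it, the Framingham score is typically implemented as a point-based formula: each risk factor contributes points, and the total maps to a predicted risk. The sketch below is purely illustrative, with made-up point values and a hypothetical function name; it is not the published Framingham algorithm, just a toy showing how such a score combines the factors listed above:

```python
# Toy example only: made-up point values to show the general shape of a
# point-based risk score. These are NOT the published Framingham coefficients.

def toy_risk_points(age, male, smoker, total_chol, hdl, hypertension, diabetes):
    points = 0
    points += max(0, (age - 40) // 5)        # older age adds points (hypothetical scale)
    points += 2 if male else 0
    points += 3 if smoker else 0
    points += 2 if total_chol > 240 else 0   # mg/dL
    points -= 2 if hdl > 60 else 0           # high HDL counts as protective
    points += 2 if hypertension else 0
    points += 3 if diabetes else 0
    return points

# Hypothetical 55-year-old male with hypertension:
print(toy_risk_points(age=55, male=True, smoker=False,
                      total_chol=210, hdl=45,
                      hypertension=True, diabetes=False))  # -> 7 points
```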

If the associations in this study represent cause-and-effect, which I believe they do based on their consistency with other observational studies and controlled trials, they imply that we can have a very powerful effect on heart attack risk by taking a few simple steps:
  1. Avoid trans fat. It's found in margarine, shortening, refined soy and canola oils, many deep fried foods and processed foods in general.
  2. Avoid industrial vegetable oils and other sources of excess omega-6. Eating pastured or omega-3 eggs, rather than conventional eggs, can help reduce dietary AA as well.
  3. Ensure a regular intake of omega-3 fats from seafood, or small doses of high-vitamin cod liver oil or fish oil. Flax oil is also helpful, but it's an inferior substitute for fish oil.

This study was conducted in Korea. It's a striking confirmation that basic nutritional principles span races and cultures, likely affecting disease risk in all humans.

In the future, I hope that most doctors will measure blood fatty acids to predict heart attack risk, with more success than current approaches. Instead of measuring cholesterol and prescribing a statin drug, doctors will prescribe fish oil and easy-to-follow diet advice**. Fortunately, some doctors are beginning to measure red blood cell fatty acid levels in their patients. The forward-thinking cardiologist Dr. William Davis has discussed this on his blog here. Take a good look at the graphs he posted if you get the chance.


*The title of the study is misleading because it implies a prospective design, in which blood fatty acids would be measured first and volunteers followed to see who develops heart disease at a later time point. This study actually used a case-control design, meaning the investigators identified people who had just had a heart attack and measured their blood fatty acids after the fact. The other study I referenced above was prospective, which is a nice confirmation of the principle.

**"Eat butter on your toast. Ditch the margarine."

Wednesday, October 21, 2009

Butter vs. Margarine Showdown

I came across a gem of a study the other day, courtesy of Dr. John Briffa's blog. It's titled "Margarine Intake and Subsequent Coronary Heart Disease in Men", by Dr. William P. Castelli's group. It followed participants of the Framingham Heart Study for 20 years and recorded heart attack incidence*. Keep in mind that 20 years is an unusually long follow-up period.

The really cool thing about this study is that they also tracked butter consumption. So it's a true no-holds-barred showdown between the two fats. Here's a graph of the overall results, by teaspoons of butter or margarine eaten per day:

Heart attack incidence increased with increasing margarine consumption (statistically significant) and decreased slightly with increasing butter consumption (not statistically significant). That must have been a bitter pill for Castelli to swallow!

It gets better. Let's have a look at some of the participant characteristics, broken down by margarine consumption:

People who ate the least margarine had the highest prevalence of glucose intolerance (pre-diabetes), smoked the most cigarettes, drank the most alcohol, and ate the most saturated fat and butter. These were the people who cared the least about their health. Yet they had the fewest heart attacks. Imagine that. The investigators corrected for the factors listed above in their assessment of margarine's contribution to disease risk; however, the fact remains that the group eating the least margarine was the least health-conscious. This affects disease risk in many ways, measurable or not. I've written about that before, here and here.

Can this study get any better? Yes it can. The investigators broke down the data into two halves: the first ten years, and the second ten. In the first ten years, there was no significant association between margarine intake and heart attack incidence. In the second ten, the group eating the most margarine had 77% more heart attacks than the group eating none:

So it appears that margarine takes a while to work its magic.

They didn't publish a breakdown of heart attack incidence by butter consumption over the two periods. Perhaps they didn't like what they saw when they crunched the numbers. I find it really incredible that we're told to avoid dairy fat with data like these floating around. The Framingham study is first-rate epidemiology. This result fits in perfectly with most other observational studies showing that full-fat dairy intake is not associated with heart attack and stroke risk. In fact, several studies have indicated that people who eat the most full-fat dairy have the lowest risk of heart attack and stroke.


It's worth mentioning that this study was conducted from the late 1960s until the late 1980s. Artificial trans fat labeling laws were still decades away in the U.S., and margarine contained more trans fat than it does today. Currently, margarine can contain up to 0.5 grams of trans fat per serving and still be labeled "0 g trans fat" in the U.S. The high trans fat content of the older margarines probably had something to do with the result of this study.
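As a quick, hypothetical illustration of that labeling loophole (the serving counts below are made up, not data from the study):

```python
# Hypothetical arithmetic: how "0 g trans fat" servings can still add up.
# A product with just under 0.5 g trans fat per serving may be labeled "0 g".
grams_per_serving = 0.49   # just under the 0.5 g rounding threshold
servings_per_day = 4       # hypothetical intake spread across "0 g" products

print(f"Hidden trans fat: {grams_per_serving * servings_per_day:.1f} g/day")  # ~2 g/day
```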

That does not make today's margarine healthy, however. Margarine remains an industrially processed pseudo-food. I'm just waiting for the next study showing that some ingredient in the new margarines (plant sterols? dihydro vitamin K1?) is the new trans fat.

Butter, Margarine and Heart Disease
The Coronary Heart Disease Epidemic


* More precisely, "coronary heart disease events", which includes infarction, sudden cardiac death, angina, and coronary insufficiency.

Sunday, October 18, 2009

A Little Hiatus

I'm going to a conference next week, followed by a little vacation. I've written two posts that will publish automatically while I'm gone. I may or may not respond to comments for the next two weeks. I probably won't respond to e-mails. I'll resume the malocclusion series when I get back.

Wednesday, October 14, 2009

Malocclusion: Disease of Civilization, Part IV

There are three periods during the development of the face and jaws that are especially sensitive to environmental influences such as nutrition and muscle activity patterns.

1: Prenatal Period

The major structures of the human face and jaws develop during the first trimester of pregnancy. The maxilla (upper jaw) takes form between the 7th and 10th week after conception. The mandible (lower jaw) begins two weeks earlier. The nasal septum, which is the piece of cartilage that forms the structure of the nose and divides the nostrils, appears at week seven and grows most rapidly from weeks 8 to 11. Any disturbance of this developmental window can have major consequences for later occlusion.

2: Early Postnatal Period

The largest postnatal increment in face and jaw growth occurs from birth until age 4. During this period, the deciduous (baby) teeth erupt, and the activity patterns of the jaw and tongue influence the size and shape of the maxilla and the mandible as they grow. The relationship of the jaws to one another is mostly determined during this period, although it can still change later in development.

During this period, the dental arch widens from its center, called the midpalatal suture. This ensures that the jaws are the correct size and shape to eventually accept the permanent teeth without crowding them.

3: Adolescence

The third major developmental period occurs between ages 11 and 16, depending on the gender and individual, and happens roughly at the same time as the growth spurt in height. The dental arch continues to widen, reaching its final size and shape. Under ideal circumstances, at the end of this period the arch should be large enough to accommodate all teeth, including the third molars (wisdom teeth), without crowding. Narrow dental arches cause malocclusion and third molar crowding.

Growth of the Dental Arch Over Time

The following graph shows the widening of the dental arch over time*. The dotted line represents arch growth while the solid line represents growth in body height. You can see that arch development slows down after 6 years old, resumes around 11, and finally ends at about 18 years. This graph represents the average of many children, so not all children will see these changes at the age indicated. The numbers are in millimeters per year, but keep in mind that the difference between a narrow arch and a broad one is only a few millimeters.

In the next few posts, I'll describe the factors that I believe influence jaw and face structure during the three critical periods of development.


* These data represent many years of measurements collected by Dr. Arne Bjork, who used metallic implants in the maxilla to make precise measurements of arch growth over time in Danish youths. The graph is reproduced from the book A Synopsis of Craniofacial Growth, by Dr. Don M. Ranly. Data come from Dr. Bjork's findings published in the book Postnatal Growth and Development of the Maxillary Complex. You can see some of Dr. Bjork's data in the paper "Sutural Growth of the Upper Face Studied by the Implant Method" (free full text).

Saturday, October 10, 2009

Malocclusion: Disease of Civilization, Part III

Normal Human Occlusion

In 1967, a team of geneticists and anthropologists published an extensive study of a population of Brazilian hunter-gatherers called the Xavante (1). They made a large number of physical measurements, including measurements of the skull and jaws. Of 146 Xavante examined, 95% had "ideal" occlusion, while the 5% with malocclusion had nothing more than mild crowding of the incisors (front teeth). The authors wrote:
Characteristically, the Xavante adults exhibited broad dental arches, almost perfectly aligned teeth, end-to-end bite, and extensive dental attrition [tooth wear].
In the same paper, the authors present occlusion statistics for three other cultures. According to the papers they cite, the prevalence of malocclusion was 59% in Japan and 64% in the US (Utah). They also mention another native group living near the Xavante, part of the Bakairi tribe, living at a government post and presumably eating processed food. The prevalence of malocclusion in this group was 45%.

In 1998, Dr. Brian Palmer (DDS) published a paper describing some of the collections of historical skulls he had examined over the years (2):
...I reviewed an additional twenty prehistoric skulls, some dated at 70,000 years old and stored in the Anthropology Department at the University of Kansas. Those skulls also exhibited positive [good] occlusions, minimal decay, broad hard palates, and "U-shaped" arches.

The final evaluations were of 370 skulls preserved at the Smithsonian Institution in Washington, D.C. The skulls were those of prehistoric North American plains Indians and more contemporary American skulls dating from the 1920s to 1940s. The prehistoric skulls exhibited the same features as mentioned above, whereas a significant destruction and collapse of the oral cavity were evident in the collection of the more recent skulls. Many of these more recent skulls revealed severe periodontal disease, malocclusions, missing teeth, and some dentures. This was not the case in the skulls from the prehistoric periods...
The arch is the part of the upper jaw inside the "U" formed by the teeth. Narrow dental arches are a characteristic feature of malocclusion-prone societies. The importance of arch development is something that I'll be coming back to repeatedly. Dr. Palmer's paper includes the following example of prehistoric (L) and modern (R) arches:


Dr. Palmer used an extreme example of a modern arch to illustrate his point; however, arches of this width are not uncommon today. Milder forms of this narrowing affect the majority of the population in industrial nations.

In 1962, Dr. D.H. Goose published a study of 403 British skulls from four historical periods: Romano-British, Saxon, medieval and modern (3). He found that the arches of modern skulls were less broad than at any previous time in history. This followed an earlier study showing that modern British skulls had more frequent malocclusion than historical skulls (4). Goose stated that:
Although irregularities of the teeth can occur in earlier populations, for example in the Saxon skulls studied by Smyth (1934), the narrowing of the palate seems to have occurred in too short a period to be an evolutionary change. Hooton (1946) thinks it is a speeding up of an already long standing change under conditions of city life.
Dr. Robert Corruccini published several papers documenting narrowed arches in one generation of dietary change, or in genetically similar populations living rural or urban lifestyles (reviewed in reference #5). One was a study of Caucasians in Kentucky, in which a change from a traditional subsistence diet to modern industrial food habits accompanied a marked narrowing of arches and increase in malocclusion in one generation. Another study examined older and younger generations of Pima Native Americans, which again showed a reduction in arch width in one generation. A third compared rural and urban Indians living in the vicinity of Chandigarh, showing marked differences in arch breadth and the prevalence of malocclusion between the two genetically similar populations. Corruccini states:
In Chandigarh, processed food predominates, while in the country coarse millet and locally grown vegetables are staples. Raw sugar cane is widely chewed for enjoyment rurally [interestingly, the rural group had the lowest incidence of tooth decay], and in the country dental care is lacking, being replaced by chewing on acacia boughs which clean the teeth and are considered medicinal.
Dr. Weston Price came to the same conclusion examining prehistoric skulls from South America, Australia and New Zealand, as well as their living counterparts throughout the world that had adhered to traditional cultures and foodways. From Nutrition and Physical Degeneration:
In a study of several hundred skulls taken from the burial mounds of southern Florida, the incidence of tooth decay was so low as to constitute an immunity of apparently one hundred per cent, since in several hundred skulls not a single tooth was found to have been attacked by tooth decay. Dental arch deformity and the typical change in facial form due to an inadequate nutrition were also completely absent, all dental arches having a form and interdental relationship [occlusion] such as to bring them into the classification of normal.
Price found that the modern descendants of this culture, eating processed food, suffered from malocclusion and narrow arches, while another group from the same culture living traditionally did not. Here's one of Dr. Price's images from Nutrition and Physical Degeneration (p. 212). This skull is from a prehistoric New Zealand Maori hunter-gatherer:


Note the well-formed third molars (wisdom teeth) in both of the prehistoric skulls I've posted. These people had ample room for them in their broad arches. Third molar crowding is a mild form of modern face/jaw deformity, and affects the majority of modern populations. It's the reason people have their wisdom teeth removed. Urban Nigerians in Lagos have 10 times more third molar crowding than rural Nigerians in the same state (10.7% of molars vs. 1.1%, reference #6).

Straight teeth and good occlusion are the human evolutionary norm. They're also accompanied by a wide dental arch and ample room for third molars in many traditionally-living cultures. The combination of narrow arches, malocclusion, third molar crowding, small or absent sinuses, and a characteristic underdevelopment of the middle third of the face is part of a developmental syndrome that predominantly afflicts industrially-living cultures.


(1) Am. J. Hum. Genet. 19(4):543. 1967. (free full text)
(2) J. Hum. Lact. 14(2):93. 1998
(3) Arch. Oral Biol. 7:343. 1962
(4) Brash, J.C.: The Aetiology of Irregularity and Malocclusion of the Teeth. Dental Board of the United Kingdom, London, 1929.
(5) Am. J. Orthod. 86(5):419.
(6) Odonto-Stomatologie Tropicale. 90:25. (free full text)

Saturday, October 3, 2009

Malocclusion: Disease of Civilization, Part II

The Nature of the Problem

In 1973, the US Centers for Disease Control and Prevention (CDC) published the results of a National Health Survey in which it examined the dental health of American youths nationwide. The following description was published in a special issue of the journal Pediatric Dentistry (1):
The 1973 National Health Survey reported 75% of children, ages 6 to 11 years, and 89% of youths, ages 12 to 17 years, have some degree of occlusal disharmony [malocclusion]; 8.7% of children and 13% of youth had what was considered a severe handicapping malocclusion for which treatment was highly desirable and 5.5% of children and 16% of youth had a severe handicapping malocclusion that required mandatory treatment.
89% of youths had some degree of malocclusion, and 29% (13% plus 16%) had a severe handicapping malocclusion for which treatment was either highly desirable or mandatory. Fortunately, many of them received orthodontic treatment, so the malocclusion didn't persist into adulthood.

This is consistent with another survey conducted in 1977, in which 38% of American youths showed definite or severe malocclusion. 46% had occlusion that the authors deemed "ideal or acceptable" (2).

The trend continues. The CDC's National Health and Nutrition Examination Survey III (NHANES III), conducted in 1988-1991, found that approximately three fourths of Americans aged 12 to 50 had some degree of malocclusion (3).

The same holds true for Caucasian-Americans, African-Americans and Native Americans in the US, as well as for other industrial nations around the world. Typically, only 1/3 to 1/2 of the population shows good (but not necessarily perfect) occlusion (4-8).

In the next post, I'll review some of the data from non-industrial and transitioning populations.


Malocclusion: Disease of Civilization


1. Pediatr. Dent. 17(6):1-6. 1995-1996
2. USPHS Vital and Health Statistics Ser. 11, no 162. 1977
3. J. Dent. Res. Special issue. 75:706. 1996. Pubmed link.
4. The Evaluation of Canadian Dental Health. 1959. Describes Canadian occlusion.
5. The Effects of Inbreeding on Japanese Children. 1965. Contains data on Japanese occlusion.
6. J. Dent. Res. 35:115. 1956. Contains data on both industrial and non-industrial cultures (Pukapuka, Fiji, New Guinea, U.S.A. and New Zealand).
7. J. Dent. Res. 44:947. 1965 (free full text). Contains data on Caucasian-Americans and African-Americans living in several U.S. regions, as well as data from two regions of Germany. Only includes data on Angle classifications, not other types of malocclusion such as crossbite and open bite (i.e., the data underestimate the total prevalence of malocclusion).
8. J. Dent. Res. 47:302. 1968 (free full text). Contains data on Chippewa Native Americans in the U.S., whose occlusion was particularly bad, especially when compared to previous generations.