Thursday, February 28, 2008
Jasanoff (1996) - Mad Cow Disease
Note: BSE stands for Bovine Spongiform Encephalopathy (the medical term for Mad Cow)
In this article, Jasanoff describes the United Kingdom's (UK) response to the BSE scare, which involved cattle infections and human cases/deaths.
_______________________
SOME KEY POINTS:
* BSE as a threat to the scientific establishment, which thought it knew all the facts. For years, UK health officials acknowledged BSE in cattle but denied there was a risk to humans. In 1996, however, they admitted that human cases arising from consuming BSE-contaminated beef had likely occurred.
* Government institutions need public trust as a form of legitimacy. Citizens, moreover, need reassurance that someone is managing risk.
* “Civic dislocation” – British government failed to do its job in warning and protecting the public; trust plummeted.
* Other institutions took its place – everyone from beef producers to restaurants; Accidental Risk Communicators!
* The dangers of having a “trust us, we’re the experts” approach to risk management (prominent in the UK, not so much in the US?).
* A UK risk management process that is insulated from public opinion and input. Do UK risk managers, as a result, downplay uncertainty and avoid dissent?
* “Top-down” versus “bottom-up” risk management – Does the latter (and its emphasis on public engagement) represent a new direction for the British government?
Shepherd (1981) - Selectivity of Sources: The Case of Marijuana
Shepherd was interested in the credentials of sources cited by the media in coverage of the marijuana controversy (i.e., whether the drug is harmful or not). Using a sample of newspaper articles (n=275) about marijuana from 1967 to 1982, he
(1) Examined how frequently actual studies on marijuana's effects were mentioned, and
(2) Compared the frequency of expert source mentions with how often these individuals were cited in the Science Citation Index (SCI).
Shepherd found that only 22% of newspaper articles actually cited studies about the health effects of the drug. Far more articles featured "experts" commenting on studies. However, two-thirds of these "experts" had no formal citations in the SCI in regard to medical research on marijuana. So, these individuals were not really experts on marijuana, though they may have been experts in other fields.
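To make the logic of the two tallies concrete, here's a minimal sketch in Python of how one might compute them. The data structures, names, and numbers are made up for illustration; Shepherd's actual coding procedure isn't described at this level of detail.

```python
# Hypothetical reconstruction of Shepherd's two tallies (illustrative data,
# not his actual coding scheme). Each article record notes whether it cites
# an actual study and which named "experts" it quotes; sci_counts maps each
# expert to their Science Citation Index count for marijuana research.

articles = [
    {"cites_study": True,  "experts": ["Dr. A"]},
    {"cites_study": False, "experts": ["Dr. B", "Dr. C"]},
    {"cites_study": False, "experts": ["Dr. C"]},
]
sci_counts = {"Dr. A": 12, "Dr. B": 0, "Dr. C": 0}

# (1) Share of articles that cite actual studies (Shepherd found ~22%).
share_citing = sum(a["cites_study"] for a in articles) / len(articles)

# (2) Share of quoted "experts" with no SCI record on the topic
#     (Shepherd found ~2/3).
quoted = {e for a in articles for e in a["experts"]}
share_uncited = sum(sci_counts.get(e, 0) == 0 for e in quoted) / len(quoted)

print(f"Articles citing studies: {share_citing:.0%}")
print(f"Quoted experts with no SCI record: {share_uncited:.0%}")
```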
SOME THINGS TO CONSIDER:
* “Science celebrities” (but perhaps not experts in a specific field) versus specialists with actual research expertise: who should journalists cite in a story?
* Getting experts to serve as sources is important, but should we care about who reporters are talking to? Should scientists themselves care? After all, the public often can’t evaluate the credentials of a source (at least directly).
* The risk of the media creating an “expert” who is not really an expert (at least in the field in question).
* Citations are not always a mark of respect; you can cite a study to dispute it, for example.
* To what extent should journalists be responsible for evaluating the credibility of a source?
Wednesday, February 27, 2008
Kirby, Scientists on the set
EP: What I like, what I’d like to play up here is the contrast between the innocuous, healthy tissue and the villainous, evil, writhing virus.
Cline: A virus is . . . They can’t actually move.
EP: They can’t?
Cline: No, they can’t.
EP: Oh. (Roach, 1995: 81) (p.269)
Introduction
- There is concern among scientists, as well as within the NSF, that science in fiction has polluted public understanding of science. In truth, it is more likely that "the presentation of science in fictional narratives provides an environment in which preexisting attitudes are readily cultivated and reinforced" (p. 263, a view Kirby attributes to Shanahan and Morgan). Kirby believes that such relationships lead not just to the presentation of existing knowledge, but to the creation of new knowledge.
- Ultimately, Kirby says that it is not the content alone, but the mediation among scientists, the entertainment industry, and audiences that produces the representation of science in entertainment media (p. 263).
Compensation for Consulting
- Movies hire scientists to add realism and legitimacy to their films
- Scientists participate in films for a number of reasons. Most commonly, they do so as a "public service" to improve the public understanding of science, and often they take no compensation at all. However, some scientists make up to $200 an hour for their consultation. Still others feel it is unethical to accept money personally, and instead accept grants and funding for further research.
- “On the one hand, they believe that as scientists they should give scientific advice freely to anyone who seeks knowledge. On the other hand, they are providing a specialized service for filmmakers and believe they should receive compensation of some type” (p. 266)
- Science organizations have also worked with films, lending or donating expertise, equipment, and locations in exchange for publicity, grant money, etc. An extreme example is NASA, which has an "Entertainment Industry Liaison."
The Role of the Science Consultant in Fictional Science’s Depiction
- Though scientist consultation is highly valued, according to Kirby, a compromise must be struck between ultimate scientific “reality” on the one hand, and the business of film-making on the other. Budgetary constraints as well as stylistic choices both tend to clash with scientists’ advice to a degree.
Science Consultants and the Presentation of “Science” in Fictional Films
- Kirby discusses several ways in which a consultant can advise filmmakers
- Playing a scientist: actors who want to create a 'realistic' scientist often work with scientists to develop their characters
- The “look” of science: consultation on set decoration and prop design
- “Factual” content and the inclusion of disputed science: scientists can advise on the accuracy of scientific information. In essence, they are “fact checking.” In many cases, they are advising on information widely agreed upon by the scientific community. In some cases, they are advising on “facts” that are not as widely accepted.
Conclusion and Discussion
- With such a large number of films and television shows produced about science, many of which include “inaccurate” science, the public will not be able to tell the scientifically accurate depictions from the wildly inaccurate ones. In this sense, science consultants will not “improve science literacy”
- However, Kirby attributes the perceived need to improve literacy to scientists' embrace of the deficit model, which has come under heavy criticism. Ultimately, he believes consulting will not serve the deficit model's version of literacy, but will help with the public understanding of science:
- “While increasing ‘scientific accuracy’ in fiction may not enhance the public understanding of science as prescribed by the deficit model, the presence of scientists in the filmmaking process can improve the public understanding of science; that is, if we take scientists’ concern with ‘public understanding of science’ to mean more than public ‘appreciation’ of science” (p. 274).
My Thoughts:
- It was very interesting and fun, and he makes some very strong points about the actual impact on science literacy.
- I wonder if Kirby’s methodology supports the kind of generalizations he makes about scientists’ relationships with filmmakers.
- I think he might be oversimplifying the relationship between filmmakers, consultants, and the public.
- While he discusses the idea that science consultants and films play a role in creating new knowledge, I don’t think he discusses this in enough depth. If I were writing a paper on the subject, I would look elsewhere for a serious discussion, though he does provide one example.
- It seems as though the “appreciation of science” as science literacy model has also come under attack recently.
Tuesday, February 26, 2008
Science in Film
Kirby and Weingart et al. both address the portrayal of science in the movies and what that means for science and society. Kirby sees the intersection of science and cinema grouped around four major research questions: Production (how science is represented in the production of cinematic texts), Content Analysis (how much and what kind of science appears in films), Cultural Meanings (cultural interpretations of science and technology in film), and Media Effects (the effect of cinematic portrayals of science on science literacy and public attitudes towards science). Kirby, in particular, is interested in getting away from scientists’ narratives of popular films as detrimental to science literacy (by propagating inaccurate science) and towards a greater appreciation of how films produce and reproduce cultural meanings of science and thereby affect attitudes towards science.

In the realm of film production, filmmakers and scientists have different needs and attitudes towards the “authenticity” and “verisimilitude” of science: scientists are concerned primarily with accuracy over an entire film, while filmmakers treat authenticity as a resource for drawing in audiences, subservient to the constraints of budget, time, and narrative. Filmmakers can leverage the authority of scientific consultants to add authenticity to their films, but whether or not they do so, the notion of “authenticity” is contested and negotiated between filmmakers and scientists, and even films with notable inaccuracies from a scientific perspective may still be meaningful and useful for science education. However, filmmakers’ claims to authenticity may harm science literacy because they naturalize both “accurate” and “inaccurate” science.
Content analysis consists primarily of quantitative studies of how much and what kind of science appears in films. Kirby cites Weingart et al. in this regard and doesn’t himself go further into it. Weingart’s analysis of 222 films over eight decades finds that the medical sciences appear most often, with psychology at the top, then biology and genetics, followed by physics and chemistry. Though Kirby considers Weingart’s study primarily content analysis, the category blurs into cultural meanings, as Weingart and colleagues spend most of their time looking at patterns in the portrayal of science in film. The frequency of disciplines portrayed is closely related to the frequency of scientist archetypes and the cultural view of the field in which each scientist works.

The study finds that over 80 years, most scientists in film are white, American, male, and middle-aged; over a third are single, and another third are in relationships whose status is unknown. Only 18% of scientists are female, and all are younger, more attractive, and lower on the career ladder than their male counterparts. Weingart splits depictions into “benevolent” and “mad scientist” poles, with “ambivalent” scientists in between, but notes that even “benevolent” scientists are to some extent “ambivalent” in their portrayal as naïve but well-meaning. Mad scientists are generally associated with biological or psychological disciplines, sometimes with the physical sciences, but are always willing to cross ethical lines for the sake of knowledge or fame. Anthropology, astronomy, zoology, geology, and the humanities tend to be portrayed positively.

Ways in which knowledge is gained on film include animal and human experiments, field studies and expeditions, knowledge acquired through genius, and knowledge acquired by accident. Certain of these methods, biological experimentation and occasionally genius, are more often associated with mad scientists, while astronomy and the humanities are almost never portrayed negatively; on the other hand, not being associated with genius, they are also often seen as marginal or insignificant. Rarely are scientific methods actually portrayed, usually only the results, the exceptions being psychological and biological disciplines portrayed negatively due to the criminal nature of the methods shown. The location of scientific production also correlates with positive or negative portrayal: mad scientists are always seen doing research in secret, outside the scientific community, whereas “good” science is public. The highly positive portrayal of anthropologists is connected with the action-adventure genre. Medical research, genetics, and psychology are the fields most often portrayed in conflict with ethics, followed by physics, while astronomy, anthropology, and the humanities are mostly not in conflict with ethics. A majority of science in film is fictional or future science, set in utopian or (mostly) dystopian worlds, with the science usually portrayed as just beyond the current state of research.

What Weingart et al. conclude is that in the twentieth century, films involving science have primarily been concerned with human interventions into humanity itself: creating monsters, robots, hybrids, or other abominations that have problematic relations to humanity, or raising the specter of playing God by meddling with our own natures. This explains the predominance of biology, medicine, and the human sciences in mad-scientist portrayals.
Kirby’s own analysis of the cultural meanings of science and scientists differs in its findings from Weingart’s. Rather than only mad, ambivalent, and benevolent scientists, he identifies other stereotypes: helpless scientists (who lose control of their experiments), amoral rationalists (who deny responsibility for the consequences of their research in favor of knowledge at all costs), absent-minded professors, and heroic scientists. Biopics of real scientists are another genre. These would range across Weingart’s scale from benevolent to ambivalent to evil. Kirby notices that particular stereotypes are dominant in certain decades (along with certain fields), greatly complicating Weingart’s view. For example, helpless scientists predominated in the first two decades of the twentieth century; mad scientists and biopics concerned with medicine and psychology were dominant from the ’20s to the ’40s; amoral rationalists in the nuclear and space sciences predominated in the ’50s; the same type, but in ecology, in the ’70s; and helpless scientists with computers and robots in the ’80s. Contrary to Weingart, Kirby finds that, in the ’90s at least, female scientists were well represented (33%) and portrayed “realistically,” not always conforming to gender stereotypes, although they did correspond to traditional notions of femininity (reinforcing cultural assumptions about women’s roles in science), were single and without children, and were often involved in romances. The last two decades have seen a surge in scientific plots involving genetic engineering (in one sense a resurgence of the 1920s focus on eugenics), indicated in part by the amount of scholarly activity surrounding Jurassic Park and GATTACA. The main difference between the ’90s and the ’20s is that the earlier eugenics films were concerned with the harmful effects of state eugenics, while the contemporary fear is about an individualistic society’s embrace of genetics.
In terms of media effects, Kirby points to such films as The Day After Tomorrow, Armageddon, Deep Impact, Twister, Dante’s Peak, Mission to Mars, and Space Cowboys as either generating increased awareness (though not necessarily “literacy”) of scientific issues such as global warming or near-Earth objects, or serving as tools for education or propaganda by institutions such as NASA and the U.S. Geological Survey. Studies of The Day After Tomorrow’s effects on attitudes towards global warming produced mixed results (U.S. and British viewers were more worried or convinced, while Germans were less convinced, owing to the movie’s version of events in which global cooling follows global warming), but they all agreed that increased awareness resulted regardless of whether attitudes changed.
In terms of cultural meanings and media impact, popular films, whether scientifically accurate or not, can be used to portray or substitute for “real” science, especially in cases of “genre interpenetration” among films, TV shows, documentaries, comic books, computer games, novels, and news articles. An example is the use of Outbreak scenes in coverage of actual Ebola outbreaks in Zaire. The Public Health Service’s use of Dr. Ehrlich’s Magic Bullet (1940) for educational purposes is a similar example. What is important for Kirby is that films involving science, whether scientifically accurate or not, play powerful roles in shaping people’s beliefs, meanings, and attitudes about science; they can be and are used to focus public awareness on certain scientific issues, and thus can and do affect state policies and the attitudes of scientific communities themselves towards those issues. This focus on the cultural framing of science through film gets away from the usual science-literacy framing of films as inaccurate portrayals of science.
Framing of Science and Technology
Frames are organizing ideas or story lines that provide meaning to an event. Journalists use them to package complex issues in persuasive ways by focusing on single interpretations. Social norms and values, pressures from organizations and interest groups, journalistic routines and ideological and political views of the journalists all are factors in frame creation.
Gamson and Modigliani (1989) outline five common framing devices: metaphors, exemplars, catchphrases, depictions, and visual images (or icons). For every theme, there is a counter-theme. For instance, the metaphor of technological progress is countered by Pandora’s box (runaway technology). The icons of Benjamin Franklin and Thomas Edison are countered by Frankenstein’s monster, Aldous Huxley’s Brave New World, and Charlie Chaplin’s Modern Times. Framing devices act as social heuristics, linking an issue to a commonly understood narrative or moral teaching.
Dorothy Nelkin (1995) argues that science is often cast as distinct from politics and clashes of social values. While individual scientists can be criticized as biased, science as an institution is cast as a neutral source of authority. Technology is often framed in terms of forward motion (frontiers, battles, and struggles); when described as a crisis, however, these “runaway forces” are portrayed as requiring reining in through government regulation.
Scheufele (1999) sees framing as a continuous process: frames are not merely an output of the media but also an input into it. Journalists are also audiences. They are cognitive misers like the rest of us and are equally susceptible to the very frames they use to describe events and issues. How elite journalists frame an issue affects how subsequent journalists approach it. Whoever sets the frame first is in a position of power.
Nisbet and Huge (2007) analyzed the way plant biotechnology was framed in the US media and reported that most stories relied on technical frames, such as reports on new research, economic and international competitiveness, or property rights. Only a few of the articles used dramatic frames dealing with the ethics and morality of biotechnology, scientific uncertainty, or public engagement with the issues. Nisbet believes that this largely technical framing in the US media was responsible for plant biotech being essentially a non-issue in this country, as opposed to in Europe.
In a perspectives piece in the journal Science, Nisbet and Mooney (2007) criticized scientists for believing in a public that can be persuaded by facts alone (remember the deficit model?). Instead, citizens use their “value predispositions” (political and religious beliefs) to actively engage with the news, seeking out views that reaffirm their own value systems (the psychological theory at work: cognitive dissonance). Nisbet and Mooney argue that scientists should avoid emphasizing the technical details of science when trying to defend it in public. In letters to the editor, many reacted negatively to the notion that scientists must “spin” their work, and pointed out that those who are trained to debate these issues represent a very small number of experts.
Common Frames in Science/Technology
Social and technological progress (“better living through chemistry”)
Economic development and competitiveness (energy independence)
Soft paths (living harmoniously with the world and its resources)
Public Accountability (Ralph Nader)
Faustian bargain (short-term trade for long-term consequences)
Consensus versus scientific controversy (global warming, intelligent design)
Playing God (stem cell research)
References
Gamson, W. A., & Modigliani, A. (1989). Media Discourse and Public Opinion on Nuclear Power: A Constructionist Approach. American Journal of Sociology, 95(1), 1-37.
Nelkin, D. (1995). Selling Science: How the Press Covers Science and Technology. New York: Freeman.
Nisbet, M. C., & Huge, M. (2007). Where do Science Debates Come From? Understanding Attention Cycles and Framing. In D. Brossard, J. Shanahan & C. Nesbitt (Eds.), The Public, the Media and Agricultural Biotechnology (pp. 193-230). Cambridge, MA: CABI.
Nisbet, M. C., & Mooney, C. (2007). Framing Science. Science, 316(5821), 56.
Scheufele, D. A. (1999). Framing as a theory of media effects. Journal of Communication, 49(1), 103-122.
Turow (1989) Playing Doctor
A few trends and ideas Turow recounts:
- The doctor show genre has roots that predate television itself. Old film serials about Doctor Kildare portrayed doctors in saintly, even godly, fashion. The Kildare serials and early doctor television shows cast doctors in paternalistic roles, performing necessary operations on patients in spite of their objections. Another genre convention established by these serials was the interplay between an older, wiser physician and a young up-and-coming doctor. This, of course, continues today as we see new interns broken in on ER, Grey's Anatomy, House, etc.
- Reality surgery shows aren't a new thing. The first real surgery footage was shown in a doctor show decades ago.
- As doctor shows developed, another important genre element that became prominent was that of the "Grand Hotel." A hospital setting is one in which new characters can rotate onto the scene in each week's episode. It's a great plot device and one of the reasons these shows are perennially popular.
- In contrast to saintly TV doctors like Dr. Kildare (who was reinvented in a 1960s TV show) and Marcus Welby, the "bad boy" doctor also became a staple of television during the 1960s. Ben Casey, in a show of the same name, was an antisocial, tempestuous figure who was tolerated by hospital staff only because of his brilliance as a surgeon. This archetype lives on today in the TV show House. Casey and Kildare actually aired during the same years, making them favorite television foils.
- Doctor comedies eventually joined the ranks of doctor dramas, riffing on established genre elements.
- Portrayals of the medical profession have changed considerably over the years, as the behind-the-scenes influence of the profession has waxed and waned. Early on, the AMA contributed advisors to every medical TV show. Its seal actually appeared as a stamp of approval in the credits of the various dramas, assuring audiences that they were authentic. But having AMA approval also meant that the shows had to cater to the medical profession and portray it favorably. Eventually, Hollywood wanted out of this arrangement and began producing unauthorized medical shows. The science used in the plots could be shoddy, but the shows no longer idolized the medical profession. In time, medical advisers returned to Hollywood shows in less coercive fashion, giving advice but not orders. There was a return to realism, or perhaps an escalation of it, in the early 1990s with the birth of ER. But ER and other shows in its cohort, like Chicago Hope, also did something new—they showed doctors as surprisingly fallible and mistake-prone.
- Epilogue: Though Turow doesn't get into it to a great extent in this book, elsewhere he has written on the importance of considering portrayal of medicine in fictional shows. As many people don't spend much time in hospitals until they're seriously injured, and may seldom even go to the doctor, they depend heavily on media portrayals of medicine in forming their expectations of the medical profession. Turow contends that people get far more of their information, or at least their implicit knowledge, from fictional shows than from the news, which has been the traditional focus of media researchers. This means that how doctors and patients are portrayed on TV can have an effect on real interactions in the healthcare system. Of late, he's also developed an interest in forensic shows, like CSI, because the forensic doctors are often portrayed as infallible in a Kildare fashion, even as their clinical counterparts on Grey's Anatomy and ER are seen as quite human and error-prone.
Narrative Concepts
- This post is gonna be super-long, but it's intended to summarize all the narrative pieces—a whole segment of this week's reading.
- I'm cribbing unashamedly from a lit review I did some time ago.
- It focuses on illness-narratives research (INR, for short, in this post), or narrative in health communication contexts, rather than narrative more generally.
Bruner (1990) ultimately proposes that the human brain is hardwired for processing, creating, sharing and storing narratives, and that narratives serve as a basic mode through which individuals organize experience. He further discusses how this notion creates a fascinating role for narrative at the systems level and even the cultural level (Bruner, 1986; 1990). Narratives are inscribed in cultural artifacts, from books to DVDs to Post-it notes. Entire societies share myths and story genres with ancient roots.
Some anthropologists have found a great deal of synergy between these ideas and the notion in anthropology of the cultural life course. The cultural life course has become a prominent construct in anthropology, employed in the description of everything from infertility (Becker, 2000) to aging (Fry & Keith, 1982) to the experience of adolescence in Pacific Island societies (Rubinstein, 2001). It is the idea that every society has culturally constructed views on what constitutes proper individual development and how people should progress through life over time. While biology sets limits on the sorts of behaviors and social roles people can perform at different ages, each culture has a distinct expected “life course,” with different stages, and different details concerning individual achievement and the roles people should fill at each stage.
In American culture, the life course is continually reified. Its assumptions are apparent in advertisers’ direct mailings based on age demographics, in promotion policies based on marital status, in whose pictures appear on the home-loan and retirement advertisements at the local bank, and in countless other cultural messages. Life is even a board game from Milton Bradley, in which children learn what sorts of social roles are acceptable at different phases of existence. Gay Becker (1997) also points to popular metaphors, such as “life is a journey,” that express normative notions of progression through life (p. 7).
Anthropologists are keen on using the cultural life course to unearth our deeply rooted cultural norms and assumptions. One particularly good way of doing this is to examine what happens when these norms are breached, and this is a primary focus of INR. Severe acute and chronic illnesses are good focal points for this research because people who fall ill with these often default on their social roles and obligations. They run afoul of the cultural life course, often in spite of the fact that they are fully invested in its normative assumptions.
Becker is one author who has done a particularly good job of unifying anthropological notions of the cultural life course and Bruner’s style of narrative theory. She takes the cultural life course to be much like a myth, a story pervasive in our culture, with which we are all implicitly—and often explicitly—familiar (Becker, 1997). In other words, the cultural life course is a narrative in Bruner’s sense. Becker (1993; 1994; 1997) terms the break that ill or traumatized individuals experience with this life course disruption. The notion of disruption is a common one across INR, but a common term for it has not yet been established.
Bruner (2002) terms it peripeteia, after a similar notion from Aristotle (p. 4). Others shy from expanding the academic lexicon and provide qualitative descriptions that amount to the same thing. Cheryl Mattingly (1998) refers to disruption as “ruptures from the normal course of events” in life (p. 107). Sociologist Arthur Frank (1995) calls it “the ‘loss of destination and map’ that had previously guided the ill person’s life” (p. 1). Arthur Kleinman (1988) describes it as the state of being “shocked out of our common-sensical perspective on the world” (p. 27). Regardless of the term or description employed, disruption is a pivotal concept in INR.
Another essential common element of this research perspective is what follows disruption. As Kleinman (1988) describes it, after a disruptive event, “we are then in a transitional situation in which we must adopt some other perspective on our experience” (p. 27). INR, and many narrative theorists, assume that human experience is disorganized and chaotic and that narrative is a tool for organizing our experience (Bruner, 1990; Mattingly, 1998, p. 107). As Kleinman describes it, humans employ narrative “to make over a wild, disorderly natural occurrence into a more or less domesticated, mythologized, ritually controlled, and therefore cultural experience” (quoted in Kirmayer, 1993, p. 162).
The topic of narrative as an organizing tool for experience is an area where different traditions in narrative theory differ somewhat. Bruner’s (1986) theoretical tradition draws a distinction between different methods by which humans organize experience, taking narrative and logic to be complementary ways in which humans organize and process information (pp. 11-43). Fisher’s (1987) narrative theory, on the other hand, views logic as subsidiary to—or even embedded in—narrative rationality (pp. 24-54). Thus, INR, in accepting the validity of logic as a mode of organizing experience, is in fact saying that narrative is the more important organizing tool in disruptive situations, as opposed to the only available one.
In narrative theory, individuals make sense of their lives by telling stories. Charlotte Linde is another author whose work is an intellectual root of illness-narratives research. In her book Life Stories, Linde (1993) maintains that “narrative is among the most important social resources for creating and maintaining personal identity” (p. 98). By telling stories about our lives and our past actions, we express a rationale for our actions and identify ourselves with the values we want others to associate with us. Every person is differently situated in society, expresses different goals in life and different visions for the future, and gives different reasons for what they have done in the past—all of which is to say that each person has a unique narrative structure for their own life. In short, according to Linde (1993), narrative is a tool for identity management, through which we not only manipulate our appearance to others but also create our own sense of self (pp. 98-126).
Further, Linde (1993) asserts that this is a collaborative effort. In social interaction, people help each other to manage identities by recalling stories that reinforce or even manipulate this sense of self (pp. 98-126). For instance, a would-be grandmother might ask her daughter to recall a narrative frame used in the past: “Do you remember when you were twelve and used to tell me you wanted to have children someday?” This social aspect of narrative further elucidates how social norms form around the narrative an individual projects. A person who breaks with the narrative she has sought to convey does so at the expense of the social relationships that have grown up around this projected identity.
According to INR, individuals tend to view past and future events in their lives as connected in a sensible way by a narrative thread that draws its purpose both from the cultural life course and from more individualistic narratives, also formed in social contexts. When a disruptive event occurs, the seemingly natural order and purpose of these events is destroyed; they no longer appear to lead logically toward any desired end. Frank (1995) terms this a state of “narrative wreckage” (p. 53). In response, individuals go about a process of creating new narratives for themselves, which stitch together the now-seemingly disparate events of their lives in a sensible way. As Frank (1995) puts it, new “stories have to repair the damage that illness has done to the ill person’s sense of where she is in life, and where she may be going. Stories are a way of redrawing maps and finding new destinations” (p. 53).
Becker (1997) refers to this phenomenon as the way in which “the corpus of our individual histories is brought together by a work of imagination that, in articulating the various points of connection, transforms it into a coherent story” (p. 26). For the individuals who engage in it, this is a process fraught with difficulty: it involves making sense of the devastating experience of disease on the one hand and, on the other, forming new social relationships to replace those that, sadly, often break down in the face of illness.
People constructing new narratives for themselves are likely to need new information; thus the narrative literature has the potential to tie in nicely with the existing health/risk literature on information seeking and processing. I took a shot at doing this in the original lit review, but I digress—I've written more than enough for our blog's purposes.