Experience Report Reviewing
The Good, the Bad, and the Ugly
by the Erowid Reviewing Crew
October 2002
Citation:   Erowid. "Experience Report Reviewing: The Good, the Bad and the Ugly". Erowid Extracts. Oct 2002;3:18-20.
The Experience Vaults are one of the most popular parts of Erowid, cumulatively getting well over 30,000 page views per day. They provide the closest thing we have to public forums where visitors can submit their ideas and opinions for public display.

Because experience reports by definition are subjective, they are put through less fact-checking than many other types of articles on Erowid. But they do go through a lengthy process of approval. The two most common questions we receive about the Experience Vaults are whether the reports are checked or reviewed at all, and why a particular submitted report has not yet been displayed on the site. This article is an introduction to the Erowid Experience Vaults, how reports are chosen for publication, and what type of reports do not make it onto the site.

Initial Impressions, by Scotto
As Earth explained to me how to benchmark the average report, I was surprised to realize that the average report was of considerably poorer quality than I'd expected. It's easy to be arrogant and judgemental, especially because I consider myself a writer, but that was what I had to get over. I had to make a distinction between my perception of the quality of the writing versus the potential value of the actual content of the writing. It took a while to get used to.

I think the most dispiriting aspect of reviewing is how much destructive use gets reported as if it were wonderful recreational use. On the one hand that's also me being arrogant and judgemental; clearly there's a line between what some consider acceptable and what others consider destructive. But by the same token, in some cases I can clearly see harm in the patterns of use described, if only by my own standards, and it's challenging to remain relatively unbiased about evaluating the worth of the report in those cases.

The flip side is that I still see people having remarkable experiences, often in reports from those who are at the earliest stages of their psychedelic career. When I read reports by 19-year-olds and witness the sense of wonder they're experiencing, or the confusion, it gives me the opportunity to see my own experiences with a fresh eye. It helps remind me to take a step back from how jaded I usually am, to not be so "been there, done that." There are still a lot of people out there who are stumbling into this enormous world of psychedelics for the first time. When I was 19 years old and first started taking psychedelics, I instinctively looked for answers to difficult questions that the experiences raised. Our college library had a couple of Tim Leary books, which are definitely not beginner-level texts. Alt.drugs was a wasteland back then. It's still a wasteland, but there's so much more than Usenet now. It's exciting to think that in a relatively short time span, resources such as Erowid have developed to offer something more concrete to new users than I was able to find during my relatively isolated novice period.

Another thing reviewing experience reports constantly reminds me of is that it's not easy to do the work of writing up experiences. It's really easy to have a weekend trip that seems remarkable and then by Monday be too busy and tired to write about it. There seems to be little incentive. I often ask myself, "What is there novel to say about my experience?" It really does get challenging to force yourself to write it down.

The Mission
From the start it's been obvious that experience reports are an integral part of the data about psychoactive substances. If nothing else is known about a plant or chemical, a lot can be learned from a few well-written reports of its use. The design goal of the Experience Vaults is to act as a categorized repository for the long-term collection of people's experiences with both psychoactive substances and techniques, and to make those experiences easily available to people searching for information about reported use, effects, problems, and benefits. Our editorial goals are to weed out completely fraudulent entries and to keep the texts focused on the first-person experiences of the authors. The vision that keeps the project moving is one of 100,000 reports on a thousand different substances or techniques, all categorized, rated, and searchable as part of the public knowledge-base.

The Past
From 1996 through mid-1998 the "system" we used for publishing experiences was simply to request permission to use reports that we found on email lists or web boards and to ask specific individuals to write up their unusual experiences. In 1998, we created a simple web form for the submission of reports, which forwarded the stories to us by email. While this had the advantage of allowing anonymous submissions, it quickly became burdensome, as it was our policy for both Fire and Earth to read each report before we would publish it on the site (in hand-coded HTML). Eventually, we started to accumulate a large backlog of reports with no way to allow other crew members to review them while still maintaining oversight of the collection.

The Present
The third generation of the Experience Vaults -- launched in June 2000 and still in use today -- introduced a much more formalized and improved review system that has allowed us to publish more than 3,500 reports in the last two years. The Erowid experience admin system allows Erowid crew members to review incoming submissions, categorize them by substance and type of experience, then edit, rate, and approve them. The primary principle of the design is that at least two knowledgeable and trained reviewers read each submitted report before it is considered a permanent part of the archive.

When a first-tier reviewer approves a report it becomes publicly viewable, but it also enters a list of reports awaiting secondary approval. If a second-tier reviewer also approves the report, it is considered to have received "final" approval and becomes a permanent part of the collection (although it can always be taken down by a site admin). If, on the other hand, a first-tier reviewer "trashes" a submission, it is not displayed but instead enters a list of reports awaiting secondary "trashing". When a report is trashed by a second-tier reviewer, it is permanently deleted.
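The two-tier flow described above can be sketched as a simple state machine. This is purely an illustration; the state names, the `review` function, and its transitions are hypothetical, not taken from the actual Erowid admin system.

```python
# Hypothetical sketch of the two-tier review flow: two approvals make a
# report final; two trashings delete it permanently.
PENDING = "pending"                  # submitted, not yet reviewed
APPROVED_ONCE = "approved_once"      # publicly viewable, awaiting 2nd approval
FINAL = "final"                      # permanent part of the collection
TRASHED_ONCE = "trashed_once"        # hidden, awaiting 2nd trashing
DELETED = "deleted"                  # permanently removed

def review(state, decision):
    """Advance a report's state given a reviewer's decision."""
    transitions = {
        (PENDING, "approve"): APPROVED_ONCE,
        (PENDING, "trash"): TRASHED_ONCE,
        (APPROVED_ONCE, "approve"): FINAL,
        (TRASHED_ONCE, "trash"): DELETED,
    }
    # Unlisted combinations (e.g. trying to act on a FINAL report)
    # leave the state unchanged in this sketch.
    return transitions.get((state, decision), state)
```

For example, `review(review(PENDING, "approve"), "approve")` walks a report through both approval tiers to the final state.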

The Problem
It's not obvious at first glance how challenging and time-consuming it can be to review incoming reports. It takes time to read a full text, determine whether to approve or reject it, then set all relevant categories and ratings. Generally, the better a report is, the easier it is to review. While some reports are well written and a joy to read, it is much more difficult to decide what to do with the other 80%.

With each report that comes in, we feel a strong sense of obligation to honor the energy and time that the author took to write up their experience and submit it to us. Even--or perhaps especially--when reports are badly written, or describe types of use that seem less than ideal, it can be draining to decide that someone else's story isn't worth publishing. And yet, as publishers, it's our job to examine incoming reports and make educated decisions about how to apply a set of reasonable criteria for inclusion or exclusion. There are Experience Report Reviewing guidelines which spell out the general parameters by which reviewers judge submitted reports.


"While I am rather forgiving with regard to spelling and style, if I have to work to make sense of a report, I am likely to delete it."
-- Scruff

 
Accept or Reject: We reject reports we believe to be falsified, reports which are impossible to read because of bad grammar or spelling, reports which have no content related to any topic we cover, and reports which consist only of a litany of activities engaged in while high but don't address the effects of the substance. Nearly half of all submitted reports fall into one of these categories.

Report Rating: Reports are assigned an overall rating ("Amazing" to "Marginal"), which determines where they show up in the lists of publicly displayed reports and whether they are listed as "Erowid recommended". Less than 1% of displayed reports receive a rating of "Amazing" and another 1% are rated "Very Good", while most reports are rated in the "Average" range.

Most Common Reasons Reports Are Deleted
Weak in Content   Reports that contain very little beyond a mention of the drug a person took and then a description of what they did: "We drove around in Bud's car, then we went to the mall, then we walked to the quickie-mart, played video games, and everything was really bright and we laughed a lot."

Not Credible   Occasionally we get a report that just sounds utterly implausible. It's impossible to tell with what frequency spurious reports are submitted, but a reviewer can often get a sense of whether a report is completely fabricated.

Difficult to Read   Many of the reports we receive are incredibly difficult to read. Some are submitted in all capital letters (or with no capital letters) with no punctuation or paragraph breaks, others are such spelling and grammar disasters that they are completely unreadable. These sorts of reports are generally rejected immediately. Reviewers have a lot of leeway if they feel a writer's unusual style is artistic, but the basic rule is that if a flexible, college-level English reader can't make sense of it, it's not appropriate for the Vaults.

Very Uninteresting   Reports that are extremely redundant and offer no real interest, data, or color to the world. There are a lot of reports about common substances whose writing falls below even the "Marginal" level, and these we decline to publish.

What Was That?   It's not always clear from a report what substance a person actually took. While we do have a category for reports about substances that turned out not to be what the author expected, not being able to identify what substance was ingested makes a report virtually useless, and therefore these are usually deleted.

There is also a rating called "Cellar". If a report is considered unfit for display, but contains some tiny bit of relevant data that we don't want to lose, it is relegated to the cellar, undisplayed but still available for internal research. Examples would be a report that mentions hospitalization but provides no verifiable details or contact information; a very poor quality report which describes a reaction or effect we haven't heard of; or a report of a rare combination of substances that we don't find credible.

Rating is necessarily a subjective and highly personal process and it can be touchy to grade other people's writing. At this point we choose not to clearly display report ratings because we are aware how seriously some people take this type of judgement.


"I've been called a masochist for reading reports as avidly as I do. I've been fascinated with reading and hearing others' experiences since I was a teen."
-- Scruff

 
Text Editing: Some editing is done on the text of most reports, although we don't generally fix minor spelling and grammar errors. Our policy is to fix only a few errors per report in order to retain a strong sense of the author's writing skill and style.

We feel that artificially polishing each report would sanitize the incoming data, making it hard to identify the original voice of the author. Often the diction, style, spelling, and grammar of a report are all one has to go on in getting a sense of who the author is and what their level of education might be. These stylistic issues, which can often be distracting to read through, are also very much part of the data of a given report. Preserving the voice of the author both helps to capture the range of authors who write the reports and provides additional cues for choosing how much weight to give the content of a report.

* * * * *

Unfortunately, the level of care our review crew strives for makes it difficult for us to keep up with the number of incoming reports--currently a steady 25 per day--so we are constantly falling behind. Early in 2002, Sophie led a charge to clear out pending reports more than a year old, and succeeded. But even weeks of nothing but experience reviewing only caught us up to last year's submissions. There are thousands of reports which have never been read. When people inquire why their report hasn't been posted to the site, we sheepishly have to respond that we are doing the best we can to work through the submitted reports, but that the project is critically understaffed and underfunded.

Experience Vault Statistics
Total Reports Submitted 17,396
     Submitted Each Day 25
Total Reports Reviewed 10,719
     Published 4,527
     Declined (1st pass) 1,782
     Declined (permanent) 4,290
     On Hold 120
Total Awaiting Review 6,797
Oct 2002
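As a quick sanity check, the reviewed-report categories in the table sum to the stated reviewed total (the overall submitted and awaiting figures do not reconcile as cleanly, so only the reviewed breakdown is checked here). Figures are copied directly from the table above.

```python
# Consistency check on the Oct 2002 statistics table:
# the four review outcomes should sum to "Total Reports Reviewed".
published = 4527
declined_first_pass = 1782
declined_permanent = 4290
on_hold = 120

total_reviewed = published + declined_first_pass + declined_permanent + on_hold
```

Here `total_reviewed` comes out to 10,719, matching the table's "Total Reports Reviewed" line.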
The solution to the problem may seem simple: find more people to do the reviewing work. But people who volunteer to review reports often imagine the process to be much more fun than it is, which leads most volunteers to quit almost before they begin. Even with a rigorous application process to weed out those who aren't serious, along with a request that people commit to reviewing at least 40 reports before giving up, fewer than half of those who agree actually complete 40 reports. Since it takes more than an hour to train someone on the interface and then another few hours of oversight by a second-tier reviewer, finding committed reviewers can be a burden on the busiest of the crew.

The Future
The primary problem with our current reviewing system seems to be the time it takes to train new reviewers, combined with a difficult and somewhat tiresome process that loses the attention of casual volunteers. A fourth-generation system, being designed to help resolve some of these problems, includes a triage system for incoming reports that will incorporate a significantly simpler interface for use by casual volunteers. This will be used as a first stage to pre-sort and provide basic categorization for incoming reports, which will then move on to full review by the crew.

* * * * *

Through the work of a few dedicated reviewers, the Experience Vaults have grown into a valuable public archive. Scruff has been an amazing reviewer and has processed more than 2800 reports. Sophie has reviewed over 1700 reports in the last year and MorningGlorySeed has reviewed 450. Other long-term review crew include, in order of number of reports reviewed, Tacovan, Erica, Shell, Catfish, Desox, and Scotto. It is only through the sustained efforts of these committed individuals that the project is able to thrive.

We cannot express enough our appreciation to those visitors who have taken the time to write and submit quality reports and to those reviewers who have tromped through the seemingly endless quagmire of human folly, searching to unearth gems of insight, clarity or error, to be added to the public record.

Meme Cultivation
Describe Your Experience, Not Mine
Although the general rule for editing experience reports is to change as little as possible of the author's language, one of the primary changes reviewers are encouraged to make is to modify second person and "didactic" language. Although this does change the voice of the author, we feel strongly that there is value both in encouraging people to think and write in terms of their own experience, and in not telling others what to think, feel or do. The Experience Vaults are intended for descriptions of experiences, not for broad treatises on the use of psychoactives.

The first part of the policy is to adjust instances of second-person language where the pronoun "you" is used. An example would be changing "Mescaline gives you body tingles" to "Mescaline gives me body tingles".

While there are certainly phrases and uses of "you" that are acceptable--and a reviewer will leave such sentences intact if they're not directly about personal experience or are crafted with skill and intention--projecting one's personal experiences onto everyone else in the world is a common error that Erowid is keen to discourage.
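A rough illustration of this policy: a reviewer-assist helper might flag sentences containing second-person pronouns for manual attention. This is a hypothetical sketch, not part of the actual admin system; it deliberately flags rather than rewrites, since, as noted above, some uses of "you" are acceptable and the call requires human judgement.

```python
import re

# Match second-person pronouns as whole words, case-insensitively.
SECOND_PERSON = re.compile(r"\b(you|your|yours|yourself)\b", re.IGNORECASE)

def flag_second_person(text):
    """Return the sentences that contain second-person language."""
    # Naive sentence split on end punctuation followed by whitespace.
    sentences = re.split(r"(?<=[.!?])\s+", text)
    return [s for s in sentences if SECOND_PERSON.search(s)]
```

For instance, `flag_second_person("Mescaline gives you body tingles. I felt warm.")` returns only the first sentence, leaving the reviewer to decide how to rephrase it.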

The second part of this policy is the removal of overly didactic (lecturing) text. Some authors fill their reports with broad conclusions about how others should act based on their own experiences, experiences which may not even be described in the report. When an author uses didactic language like, "first time users should always..." or "remember to always...", it's time to edit the text to reflect that person's unique experience rather than their assumptions about what others should experience.

We work to make sure we don't remove the personal lessons or insights that an author is trying to impart, but instead rephrase them as exactly that--the insights of an individual.