
Indies and Behemoths: A Decade of Extremes

BALTIMORE SUN

Two seminal events of 1989 helped set the course for film in the 1990s.

First, a talky, micro-budgeted film by an unknown director won the audience award for best film at the Sundance Film Festival. It was subsequently bought by a little-known outfit called Miramax Films, ushering in the era of the independent film. “sex, lies, and videotape,” the little-engine-that-could, became more powerful than a speeding locomotive as Hollywood set out to find the next Steven Soderbergh.

That same year, Sony Corp., the Japanese consumer electronics giant, purchased Columbia Pictures, ushering in the era of vertical integration, corporate values and synergy. Suddenly, the same corporate parent owned a film production, distribution and exhibition company, not to mention the print and television media that flogged the products.

The consequences of both events played out in the 1990s. Hollywood has become increasingly corporate even while luring the latest hot young independent talent. Indeed, with mavericks like Miramax either coming under some other studio’s corporate umbrella or disappearing altogether, the very term “independent” has come to mean less and less.


But that doesn’t mean that first-time filmmakers don’t blast out of nowhere with a film that throws audiences for a loop and turns Hollywood on its ear: Mr. Soderbergh, meet the “Blair Witch” boys.

Nicely bridging the two trends, computerization has had as huge an impact on the movies as it has on every other aspect of our lives. In some ways it has added to the synthetic feeling of the worst of corporate product; in others it has put the means of production and marketing more easily in artists’ hands, making good on Francis Ford Coppola’s prediction that even a “fat girl in Ohio” will someday be able to make a movie.

Digital technology may even change the way we see movies in theaters, with personal computers promising to become theaters in themselves. But will people trade the communal theatrical experience for more hours of being alone with their terminals? It’s difficult to imagine the day when the word “download” carries the same magical power as “roll ‘em.”


For now, here’s a look at how the 1990s changed movies and how we watch them:

1991: “Terminator 2: Judgment Day” is released, starring Arnold “I’ll be back” Schwarzenegger. But the real star of the movie is the evil cyborg T-1000, played alternately by actor Robert Patrick and his computer-generated doppelganger. For the first time, a digital character is just as convincing as his human counterpart. Morphing, which had previously been seen in the ill-fated “Willow,” becomes a movie staple.

Two years later “Jurassic Park” finishes what “T2” started, convincing audiences and the industry that digital images can be scary and convincing even in close-up. Two years after that, “Toy Story,” the first computer-animated feature-length film, makes the digital revolution virtually complete.

1992: A former video store clerk from Los Angeles arrives at the Sundance Film Festival with his first movie, “Reservoir Dogs,” whose stylized violence, pop-culture references, hip soundtrack and ensemble cast of great character actors startle critics and audiences. Soderbergh’s heir is anointed. Two years later Quentin Tarantino fulfills that early promise with “Pulp Fiction,” a sprawling post-modern epic that makes more than $100 million at the box office.


Meanwhile the fractured narrative structure, manic dialogue and speedy editing of “Pulp Fiction” inspire thousands of imitators (for good and for ill), reignite the flagging career of John Travolta, ensure Miramax’s status as a Hollywood player and introduce the concept of independent films crossing over to a mainstream audience.

1995: In Dallas, AMC Entertainment opens the Grand Theatre, a 24-screen theater that introduces the term “megaplex” to the American vernacular and becomes the first theater in the country to feature stadium seating and floor-to-ceiling, wall-to-wall screens. The race is on to build theaters with the most screens and amenities (the biggest today is the 30-screen, 5,924-seat Mesquite Theatre in suburban Dallas).

Although the Grand promises to play independent and foreign fare, it quickly abandons smaller films to play only Hollywood products. Independently owned theaters and art houses increasingly become a thing of the past.

Also in 1995: For the first time, Hollywood’s international gross box office revenue matches its annual domestic receipts, suggesting that the era of ancillary markets is upon us. Audiences soon notice an uptick in movies featuring less dialogue and more action, along with Happy Meal tie-ins, as a movie’s success in American theaters takes a back seat to such lucrative areas as overseas release, home video, television rights and merchandising.

1996: A hip, ironic take on the slasher movies of the 1970s and 1980s, starring a bevy of young, attractive television stars, wows its teen audience and reminds everyone that culture consumers are living in a pediocracy. Made on a bare-bones budget but with just the right amount of attitude, “Scream” launches a flotilla of sequels and imitators, including “Scream If You Know What I Did Last Halloween” (since renamed “Scary Movie”), an ironic take on the slasher movies of the 1990s.

1997: African Americans make significant strides on screen and off, as Will Smith becomes the star to beat in “Men in Black” (hot on the heels of “Independence Day” a year earlier); actor Samuel L. Jackson produces “Eve’s Bayou,” one of the most highly acclaimed movies of the year; and record producer Babyface Edmonds produces “Soul Food,” a warmly observed drama-comedy about a black family in Chicago.


1998: Internet surfers find a Web site devoted to an obscure legend about a Maryland witch, and a full-blown cultural phenomenon is born. Daniel Myrick and Eduardo Sanchez, who shot “The Blair Witch Project” in rural Maryland with little money but loads of imagination, have no idea whether their tiny movie, filmed mostly on video with three unknown actors, will ever be picked up by a major studio. So they create the ingenious site to drum up grass-roots interest. They succeed.

By the time “The Blair Witch Project” hits theaters in the summer of 1999, the want-to-see factor has long surpassed mere hype. Now one of the most profitable movies of all time, “The Blair Witch Project” brings back the idea that what’s in filmgoers’ heads is much scarier than the most graphic monster. And it puts the ballyhoo back in film marketing. It also proves that a kid with a video camera and access to a computer can create his or her own media frenzy.

1999: Theater owners from around the nation witness digital exhibition for the first time at an annual meeting in February. Clips played from a computer hard drive are shown beside the same images projected from film, to mostly good reviews for a steady, bright and sharp picture. But reports of a digital revolution may be overstated. Theater owners and studios are still squabbling over who will pay to retrofit theaters to accommodate the new delivery systems, and enterprising film stock companies and entrepreneurs are busily developing a more sophisticated celluloid product that could make digital a nonstarter. In any event, within the next few years, audiences can anticipate seeing movies in a whole new light.
