In Defence Of ‘Peak Sequel’ Capitalism

Is Hollywood running out of ideas?

In the wake of the unsurprising success of the seventh iteration of Star Wars, it can’t have escaped anyone’s attention that the American film industry is now pouring out sequels and reboots and exploiting established franchises.

This year we’re going to get Zoolander 2, My Big Fat Greek Wedding 2, Kung Fu Panda 3, Batman v Superman, Finding Dory, Captain America: Civil War, X-Men: Apocalypse, Now You See Me 2, a Ghostbusters reboot, a fifth instalment of the Jason Bourne series, Bridget Jones’s Baby, another Jack Reacher movie, another Independence Day, a sequel to Bad Santa, and of course the next Star Wars film.

After 2016, there’s another Indiana Jones in the works, at least one more Alien, another American Pie, more Avatars, another Blade Runner, a Die Hard prequel, another Frozen, and apparently a Star Wars every year until we die. Dominic Knight has dubbed this “peak sequel”. By one count there are 156 sequels in the works.

So it’s easy to be pessimistic about the imaginative vibrancy of Hollywood. One influential essay in GQ in 2011 forecast the “(potential) death” of American film as an art. There’s a helpful infographic floating around on “Hollywood’s waning creativity”.

But there is every reason to look at Hollywood’s sequel, franchise and reboot fashion with optimism, even admiration. They are a symbol of cultural health, not stagnation.

First, the situation is not exactly as it looks. While there are more sequels there are also a lot more movies, as trade sources in the US and UK complain. Don’t like the flashy pop juggernaut of Star Wars: The Force Awakens? Go see the bleak Revenant, which just won the best picture Golden Globe.

Anyway, adaptations and franchises have been Hollywood’s game since the very beginning. Cinema has always dug through and repurposed other cultural products. One of the earliest, greatest films, the 1902 silent A Trip to the Moon, is a mixed adaptation of stories by HG Wells and Jules Verne.

In the golden age of studios, filmmakers happily converted popular novels into film. By my count, at least 15 of the 20 best picture Oscar winners between 1950 and 1969 are adaptations of novels, plays and musicals. These were sometimes very well known, including Oliver! (a film adaptation of a musical adaptation of Charles Dickens’s novel); My Fair Lady (an adaptation of a musical adaptation of a film adaptation of the stage play Pygmalion); and Ben Hur (a reboot of a 1925 adaptation of an 1880 novel that had been made into a play in 1899 and a 1907 film). Possibly the best American film is a sequel of an adaptation: The Godfather Part II. Look at how derivative the Internet Movie Database’s top 250 movies are.

It’s not clear how adapting well-loved and established stories for film is substantively more creative than adapting well-loved and established film stories for more films. What standard of creativity does the all-female reworking of Ghostbusters violate that West Side Story (a film adaptation of a musical adaptation of Romeo and Juliet) did not? It would be weird to complain we’re getting too many Shakespeare reboots. For what it’s worth, The Revenant is an adaptation of a novel too.

I made the point before Christmas that all culture relies on appropriating from earlier culture. George Lucas’s 1977 Star Wars was boldly original, but was also a complex pastiche of narrative tropes and imagery.

But there are stronger arguments for a Hollywood full of franchises than everyone-does-it and if-you-don’t-like-it-go-see-something-else.

As TV shows become more film-like – with higher production values, and longer stories that stretch across an entire television season – franchising means films are becoming more TV-like. Imagine those endless Marvel films (Iron Man, The Hulk, The Avengers, Thor, Captain America and so forth) as episodes in a long running story, rather than standalone movies. Like any show there are better episodes and worse episodes, but in sum they add up to a stronger whole than each individual would be.

Star Wars is a great example of how franchises can enrich a culture rather than shrink it. Everybody but the most contrarian agrees that Lucas’s three Star Wars prequels, released between 1999 and 2005, mostly fail as individual pieces of dramatic entertainment. (Yes, 2005’s Revenge of the Sith is an arguable exception.) But as exercises in constructing a rich and deep fictional world, they are remarkable. Sequels and franchises allow filmmakers and audiences to mine further veins of potential stories. No character need briefly appear on the screen and disappear forever. There’s always the opportunity for a spin-off.

Audiences clearly want this. The fanfic subcultures which pop up around every major film reveal an audience eager to further immerse themselves in the fictional universe. Books, comics, and TV specials are released to add depth for those who want more. The Star Wars expanded universe offers audiences a map of the long-ago, far-away galaxy with its own traditions and tales. The Marvel Cinematic Universe has its own comics, short films, TV shows, and the enormous back catalogue of stories and characters dating to the Second World War.

Even all that tacky merchandising that consumers lap up is a sign of cultural engagement. They want to take the film experience home with them. Surely this is what we want from culture – a communal experience, shared stories, imaginative worlds.

Movies have always been the most explicitly commercial art form. If you view art and commerce as distinct, separate spheres then it must be tempting to view ‘peak sequel’ capitalism as displacing the original visions of genius auteurs with repetitive dreck. But art surely has to speak to people. These grand worlds being built by sequels and franchises are doing that. They should not be regretted; they should be embraced.

Is Cultural Appropriation The Bogeyman It’s Made Out To Be?

A spectre is haunting the planet: the spectre of cultural appropriation.

To appropriate symbols from cultures that are not one’s own is apparently now disrespectful, insensitive and offensive.

A student body at the University of Ottawa has banned yoga classes as an example of “cultural genocide” and “Western supremacy”. Student unions at the University of East Anglia have targeted Mexican sombreros for “discriminatory or stereotypical imagery”.

At Oberlin College in Ohio, it is food that is problematic. The student dining hall is accused of modifying “traditional” Asian recipes “without respect”. The “undercooked rice and lack of fresh fish” offered in sushi “is disrespectful”. The Banh Mi sandwich, served on ciabatta rather than a baguette, is “uninformed”, a “gross manipulation” of this “traditional” Vietnamese dish. And the General Tso’s chicken dish is prepared with steamed chicken, rather than fried chicken – another disrespectful appropriation.

These complaints are apparently serious. They could just as well be satire. Because each of those named foods is itself the result, not the victim, of cultural appropriation.

Sushi has an ancient history in Japan but what many people in Japan and the West now see as good sushi – with its rich slices of tuna and salmon – is the result of Japanese chefs adapting their traditional dish to the tastes of American GIs during post-war occupation.

The Banh Mi is a fusion dish of French baguette – brought to Vietnam through French colonialism in the nineteenth century – and Vietnamese flavours.

And General Tso’s chicken? It dates back, at the earliest, to the 1950s, has nothing to do with the nineteenth century general Tso Tsung-t’ang, and only became famous when it was first served in a New York Chinese restaurant.

Sure, it’s easy to mock a few uninformed university students. So let’s continue.

The sombrero comes not from Mexico, but was brought from Europe by the Spanish – Don Quixote is often depicted with a flat-topped Spanish sombrero. The sombrero was then culturally appropriated by early American cowboys and evolved into their distinctive cowboy hat. For their part, the Spaniards got the sombrero from the Mongolians.

Modern yoga is so far from the ancient Indian tradition that it is better seen as a totally separate endeavour. The typical modern yoga fitness class draws on gymnastics, calisthenics and Indian wrestling. Its relationship to the fourth century Yoga Sutras of Patanjali is like the relationship between the cowboy hat and the Spanish sombrero: related but far enough apart to be considered substantively different.

Why is this important? Because the history of culture is the history of cultural appropriation. What we see as traditional national or ethnic cultures today are just the current manifestation of a long evolutionary process. Traditional foods, religions, dress and practices are constantly changing as they are exposed to other cultures, picking up and integrating the most appealing or adaptable parts.

In her important 2013 book, Cuisine and Empire: Cooking in World History, the food historian Rachel Laudan documents the many ways so-called “national cuisines” are almost always an amalgam of foreign influences, incorporating plants, animals, techniques, spices and styles that have been pushed around the globe by politics and economics. There are no “authentic” cuisines, no “traditional” foods. Everything is fusion.

The same story could be told for language, architecture, dress, religion, music, art, literary culture and on and on and on.

So the issue here is not just that the criticism of cultural appropriation is historically illiterate. It’s deeply ironic. The critics of cultural appropriation claim to be progressive. But they are in fact engaged in a deeply conservative project: one which first seeks to preserve in formaldehyde the content of an established culture and second tries to prevent others from interacting with that culture.

Appropriating other cultural symbols is not empty, cynical role-playing, it is development. By appropriating we add meaning, creating complex new rituals and relationships.

Take, for instance, the most prominent example of cultural appropriation and evolution in the modern West: Christmas.

It is well understood that Christmas is an amalgam of Christian beliefs and Pagan rituals. The Christmas tree comes from Germany, Father Christmas from England, and Christmas carols from Roman-era Christian hymns. Most people would class candy canes as one of the secular icons of Christmas but they may have been meant to represent the shepherd’s staff.

To observe a nativity scene (a first century AD stable in Bethlehem) next to a Christmas tree (an evergreen winter climate plant) is to see that there is a lot going on with this apparently simple holiday.

Why is gift-giving part of the way we celebrate the birth of Jesus Christ? Not solely because of the Three Wise Men. We might as well ask why Jewish families in the United States enjoy a plate of General Tso’s chicken on December 25. Lots of reasons.

Cultural evolution is like that: a contradictory, rich, unstable mix of tradition and change. To attack cultural appropriation as offensive, or insensitive is to attack culture itself. And just as absurd.

Lego Can Avoid Ai Weiwei, But It Can’t Avoid Politics

On Friday the Chinese dissident artist Ai Weiwei revealed that Lego had refused a bulk order of bricks from his studio. The bricks were to be used for a piece that he was going to show at the National Gallery of Victoria.

Lego says it has a long-standing policy to not knowingly supply its bricks for political uses. Yet there might be something else going on here. In an Instagram post, Ai drew a connection between Lego’s action and the recent announcement of a new Legoland to be opened in Shanghai. He later described Lego’s actions as “an act of censorship and discrimination”.

On the one hand, this ban means nothing in practice. The company may not approve of using its product for political works but Ai does not need Lego’s approval. There’s nothing stopping him from buying new Lego kits from retailers, rather than from Lego directly, then doing whatever he likes. If that fails, there’s a thriving global second-hand market for individual Lego pieces. And the artist has apparently been “swamped” by offers of donations of Lego since Friday.

When Lego declined his order the firm was no more engaged in censorship than was the Brisbane bookstore that refused to stock Campbell Newman’s biography in retaliation for his cancellation of the Queensland Premier’s Literary Awards.

On the other hand, what we’re seeing here is a toy company struggling with the political implications of its own enormous cultural profile. Lego is a very particular toy company.

The global toy industry is dominated by a few big players. Mattel and Hasbro are two of the largest umbrella firms. Almost every toy brand and product you can think of – Fisher-Price, Barbie, Power Wheels, GI Joe, Mr. Potato Head, Transformers, Jenga, Monopoly, Battleship, Cluedo – falls under one of those two, giant publicly listed companies.

But last year the privately held Lego trumped Mattel and Hasbro to become the biggest toy company in the world.

Unlike its rivals, Lego is based around a single, iconic product: the Lego brick. And unlike its rivals, it professes a peculiarly utopian ethic about the nature of play and creativity that very much reflects the era and place in which it was founded: 1950s Denmark. The firm is still based in the small Danish town of Billund. It is still very much animated by its founding myths.

For instance, Lego avoids making realistic military kits or weapons because its founder, Ole Kirk Christiansen, didn’t want to make war seem like child’s play. Star Wars branded Lego has been central to the firm’s recent success. But as David C Robertson points out in Brick by Brick: How Lego Rewrote the Rules of Innovation and Conquered the Global Toy Industry, Lego nearly passed on the Star Wars license because “the very name … was anathema to the Lego concept”.

Robertson’s book leaves you with the impression of a company struggling to come to terms with the way Lego has been repurposed and reimagined by its own consumers.

In 2010 the firm reported that about 5 per cent of its sales come from adult consumers buying for themselves. This is certainly an understatement, given Lego’s growth since then, The Lego Movie, and the fact that some parents are choosing Lego for their children partly for self-interested reasons.

Large scale Lego sculptures are a minor pop culture genre. Lego profits from this: the Architecture line, marketed to adults, taps into the ways consumers have been using the pieces unintended by Lego’s marketing team. That Ai Weiwei wants to use Lego for art is a reflection of its cultural symbolism. Ai is not a pioneer here. There are artists who work exclusively in Lego. Hobbyists make elaborate creations. There’s a rather incredible Battle of Waterloo.

Yet Lego is not a company well-geared for political controversy. At first glance their policy on controversial uses of their product is sound and clear. No politics, no religion, no military. Chinese democracy activists won’t get Lego’s approval, but then nor will Ku Klux Klan members. Lego wants to remain above the grubby material concerns of politics.

Such anti-political neutrality is obviously impossible. Whether it likes it or not, Lego is a player in the cultural life of the human species, in a way that none of Mattel’s and Hasbro’s competing brands are. Lego profits handsomely from that status. Perhaps a truer form of political neutrality would mean paying no attention to the ultimate use of bulk Lego sales.

I suspect the refusal to fill Ai’s order is more a case of mindless adherence to their no-politics policy than a sop to the Chinese state. But if it is the latter, with this controversy they’ve found themselves in the invidious position shared by firms around the world that want to serve markets in unfree countries like China.

Such relationships throw up serious ethical questions. Refuse to abide by the state’s rules and deny their oppressed citizens a product you believe will better their lives? Or obey and hope the benefits outweigh the harm of cooperation? You can imagine the tense meetings going on right now in Billund, as news of the Ai decision snakes around the world. They’re just a toy company after all.

Milton Friedman was correct when he said that the social responsibility of business is simply to increase its profits. But ours is a fallen world. Businesses are also participants in our political systems as much as our economies.

Sometimes that means toy companies have to take a stand on democracy in China. They have to choose between the Chinese state and its dissidents. Implicitly, inadvertently, perhaps even with the best of intentions, they already have.

Islamic State Is Destroying Ideas, Not Just Artefacts

It is characteristic of totalitarian societies that they feel they need control over the past as well as the present.

So it’s hard not to see an echo of Stalin’s erasure of his former comrades in the deliberate destruction of ancient artefacts and archaeological sites by the Islamic State.

The difference being that when IS bulldoze the 3000-year-old Assyrian city of Nimrud, as they reportedly did last week, they’re not just trying to erase their victims’ history, but humanity’s history as well.

In late February the United Nations released a report describing IS persecution of Christians, Shiah Muslims and religious minorities like the Yazidis as “war crimes, crimes against humanity and possibly genocide”. IS has been murdering gay men and politically active women. It is guilty of genocidal atrocities on a historical and savage scale.

Among this human slaughter the destruction of a few antiquities might seem like a small thing. And of course it is. But it still offers a revealing window into the mindset of radical Islamism.

By now everyone has seen photographs and video of IS militants smashing up statues in the Mosul museum last month. Happily some of those were plaster replicas. Not all were.

In the last few days IS has apparently been tearing down the ruins of the ancient Iraqi city of Hatra.

The most prominent Islamist destruction was that of the Buddhas of Bamiyan – two towering Buddha statues in Afghanistan dynamited by the Taliban in March 2001.

Obviously much of the destruction is deliberately done for Western eyes. The destruction of the Bamiyan Buddhas was one of the rare times Afghanistan made headlines before the September 11 terrorist attacks.

The Taliban sent mixed messages about the purpose of the destruction of the Buddhas. Some officials claimed it was done for standard iconoclastic reasons. An Islamic state could not tolerate the image of an idol from another religion.

But a Taliban envoy to the United States offered a more prosaic, political reason: the Buddhas were destroyed because the West was only offering aid money to restore statues rather than to prevent malnutrition.

The footage of the Mosul museum and Bamiyan Buddhas was broadcast across the world. One Syrian anthropologist told the New York Times in February that “it’s all a provocation”.

And it is true that IS’s iconoclastic principles don’t apparently prevent them from exploiting the lucrative black market for antiquities.

Nevertheless, IS relishes its reputation for brutality and inhumanity. That reputation is part of its recruiting strategy. It offers foreign fighters an absolute break with, and resistance to, the Western world – an ascetic and violent Islamism that is totalitarian in the truest sense of the word. It believes in nothing except itself.

This brutality is its reason for existence. It is what makes the Islamic State, in its mind, the bona fide caliphate, rather than just another militant theocracy in the Middle East.

Last week two writers at the Daily Beast said we shouldn’t attribute this historical destruction to “militant Islam” – lots of totalitarian states try to erase the past.

This is like saying we shouldn’t blame fascism for German atrocities between 1933 and 1945.

And IS’s symbolic ambitions are greater than those of its 20th century predecessors. Where Hitler and Stalin sought to rewrite history, Islamist totalitarians are trying to destroy it.

Much of the destruction is taking place out of the eyes of the West. Some we only learn about through rumours and unconfirmed reports. For instance, a stunning Ottoman castle in the Iraqi town of Tal Afar has been destroyed – we think. IS has been destroying Christian monasteries, Yezidi shrines and Muslim mosques, both Shiah and Sunni, with little reaction in the West. In Mali Islamist radicals destroyed ancient libraries and tombs.

This destruction isn’t just a calculating provocation for the benefit of Western audiences. It’s ideological.

Worse than those who would downplay the role of Islamist radicalism in this arc of destruction are the cultural relativists who excuse it.

Take this academic paper, which condemns not the destruction of the Buddhas but “Western Civilisation’s … fundamentalist ideology of heritage preservation”. The Taliban’s dynamite was just part of the back and forth of history. Why are we so precious? “This paper should not be read as a call for more destruction,” the author says. But, as they say, if you have to write it…

In fact, the Islamist war against artefacts and archaeology is part of a broader “cultural terrorism” being waged around the world, where the target is not an enemy but their idea of themselves.

The Charlie Hebdo killers – and all those who have threatened cartoonists and critics with murder – waged this sort of cultural terrorism as well: attacking not just people, but ideas and symbols that speak to how we understand ourselves. We think of ourselves as an open society; they try to close it by force.

David Hume believed that from the diversity of history we discover the “constant and universal principles of human nature”. By trying to destroy their own heritage, IS and other Islamists are trying to separate themselves from the world.

Melburnians Are Streets Behind On Pavement Food

For a city that boasts about its culture and cuisine, Melbourne has a serious deficiency: street food.

Australians mostly think of street food as a feature of the developing world – the slightly risky snacks available on the side of the road in Marrakesh or Hanoi.

But street food is everywhere. Food stands in Belgium and Holland sell chips with mayonnaise. Street vendors in Italy sell croquettes and arancini. Germans can pick up kebabs and bratwursts everywhere.

American cities have always had hot dog stands. Now, they are experiencing a food truck revolution – an explosion in mobile food vendors offering everything from Korean tacos to dumplings. Some of the best dining in the US is on the footpaths of the drabbest business districts.

Street food is varied and cheap.

At its best it is interesting and experimental. The trucks can quickly respond to consumer preferences and park where demand is highest. The communal nature of food-truck dining helps build social capital. Eating is about more than just sustenance.

But Melbourne is missing this revolution. Our food trucks are few; street stalls non-existent.

Melbourne’s street food is rubbish because Melbourne’s brick-and-mortar restaurants prefer it that way.

Businesses don’t like competition. Competition pushes down prices and forces innovation. Entrepreneurs are always trying to entice customers away.

And like any other industry, restaurants know the surest way to reduce competition is to have the government regulate your competitors. Every day, the City of Melbourne hosts about 800,000 people. Yet for those 800,000 people, the city council has approved space for just nine food trucks.

Also, these food trucks have to stay at specific locations. Not one of these locations is in the city centre itself, where you would think demand for food is highest. All but two are hidden in the parklands around the Royal Botanic Gardens.

So, perhaps the council is being ironic when it says food trucks “are an important part of city life”. They are not even allowed in the city proper. Even more brazenly, the council claims its food-truck policy is all about “responding to market demand”. It is a market the council is deliberately suppressing.

Still, at least the council pretends it is concerned about what consumers want. The neighbouring City of Yarra does not even bother with such niceties.

Yarra’s mobile food vehicle guidelines state the council’s first priority is to support existing traders in commercial premises. By “support”, it means “protect from competition”. Yarra includes some of the best shopping and cultural precincts in Australia. But until last year, food trucks were banned entirely.

Now food trucks are legal – with a permit, of course – but only if they stay at least 100 metres away from any existing takeaway business. Yarra Council can insist the trucks only operate when other restaurants are closed. It can even decide what sort of food is offered for sale.

These restrictions are nothing more than naked, anti-competitive protectionism. They reduce consumer choice. And they stifle Melbourne’s culinary identity.

Seemingly minor rules and regulations can shape a city’s culture in unexpected ways.

Much of what we imagine to be distinctive about global cities is the result of obscure local laws rather than any inherent national character. For instance, Amsterdam’s narrow buildings look that way simply because a tax in the 17th century was levied on the width of buildings. New York’s Times Square is dominated by advertising billboards not because Yankee capitalism is out of control but because the zoning code requires office towers in the square to display illuminated signs.
In Australia, the most obvious example of how regulations transform culture is liquor licensing.

Melbourne and Sydney offer a natural experiment. The people are the same; the laws are different. Until recently, Sydney had extremely expensive liquor licences.

High-priced licences encourage beer barns – licensees need the patronage to recoup the high costs. Melbourne’s much cheaper licences have allowed smaller, more distinctive venues to flourish. The laneway bars that feature in Melbourne’s tourism campaigns only exist because of our distinctive liquor regulations.

In 2008, the NSW government began to offer small-bar licences. But policymakers can’t decide whether to oppose more drinking venues (more places to get drunk) or support them (nicer places to socialise). As a result, Sydney’s small-bar revolution has been less than revolutionary.

In the same way, local councils love the cool vibe of food trucks but they also want to protect restaurants from competition. So they play both sides. The councils brag about their embryonic food-truck culture, while making it as hard as possible for the trucks to actually operate.

This political compromise works well for established restaurants and local government politicians. But it works terribly for us.

Should You Foot The Bill For Execrable Waste Of Human Resources?

Trolls like to say that trolling is an art form. To troll is to be inflammatory on the internet for the sole purpose of disrupting and offending others. It’s more nuanced than it sounds. A troll must be plausible enough to be taken seriously – don’t want to give the game away – but outlandish enough to generate the desired outrage.

Trolling is not always successful.

On a Friday night less than a fortnight ago, six dancers from a company called BalletLab performed an artistic work at the Australian Centre for Contemporary Art at Southbank. This involved them sitting on toilets and taking a dump.

The defecation was done in a most tasteful manner, obviously. The dancers were masked and cloaked in sheer golden garments. The toilets were transparent. Those involved emphasised how brilliant the performance was. The artist proclaimed that bowel movements were “humanity’s most democratic act”. The centre’s director said it was bold and challenging: “It’s wonderful, powerful work.”

Nonsense. The performance, titled Goldene Bend’er, is a badly executed troll. Nothing more, nothing less.

There’s no longer anything original or particularly provocative about bowel movements presented as art. It has been 52 years since the Italian artist Piero Manzoni canned his own “Artist’s Shit” – long enough for it to be considered a classic piece. And toilets featured in art earlier than that. Marcel Duchamp’s urinal is four years shy of its 100th birthday.

These were genuinely important works. Artists have offered up many excrement-related performances, paintings and sculptures since. Remember that infamous painting of the Virgin Mary covered in elephant dung? It’s nearly 20 years old. Poo is a well-covered topic. It’s almost a cliche.

Goldene Bend’er is indulgent and mundane. It reveals that the art world is much more pious and insular than the society it is trying to “challenge”.

Decades ago this sort of stunt would have earned front pages across the country. Politicians would have condemned it. Conservatives would have thundered. Recall how angry people were when the National Gallery bought Jackson Pollock’s Blue Poles in 1973. Recall the fury over Piss Christ in Melbourne in 1997.

But that was long ago. Think about it: six dancers did a poo in front of an audience and the only audible noise was self-congratulation. No outrage. No protests. No one cares. What is the point of shock art if it no longer shocks?

There’s nothing wrong with shock art per se. Ugliness and revulsion has always been a feature of art. Christian painters dwelt on the wounds suffered by Christ on the cross. Death was depicted as twisted skeletons. Grotesque demons and terrifying monsters populated the landscapes of hell. There’s nothing that says art has to be – or has ever been – pleasant.

The 20th century has demonstrated art can be ugly, foul, empty, disgusting, accidental, amateurish, untrained and offensive, and still be art. But surely it at least has to be creative. The only thing worse than being obscene is being boring.

The only reason such faux-radicalism survives is because we are forced to pay for it.

The dance company that performs Goldene Bend’er, BalletLab, is financially supported by the Victorian and Commonwealth governments. The Australian Centre for Contemporary Art gets its money from Victoria and the Commonwealth, too. It also receives another chunk of money from the City of Melbourne.

In its submission to Kevin Rudd’s National Cultural Policy inquiry, the centre wrote that the arts were crying out for “proper investment” – read, much more government funding.

Nonsense. Taxpayer funding protects artists from their audience. That it tends to produce more rubbish than genius is a feature, not a bug. The system is designed to favour indulgent, unpopular work over appealing work.

The first arts grant in Australia was given to a poet, Michael Massey Robinson. In 1818, he was given two cows “for his services as Poet Laureate”.

Robinson knew his market. He would write birthday odes to the King and Queen for Governor Macquarie every year.

Not much has changed. Rather than persuading consumers to pay for their work, artists only have to persuade government bureaucrats to give them a share of tax revenue.

These attempts to shock help drive the public from contemporary art – not because the art is offensive, but because it is trite. It treats the audience as the enemy.

In other words, every taxpayer-funded crap a ballet dancer takes on stage is another blow to the commercial viability of all art.

One of the maxims of the online world is don’t feed the trolls. Let’s not subsidise them either.

Sport And Betting Have Always Been Teammates

Victorian Greens senator Richard Di Natale has drafted a bill to ban betting odds being aired during sports broadcasts.

No, let’s rewrite that. Senator Di Natale has drafted a bill to kick Tom Waterhouse off the television.

Of course, Di Natale’s bill is no more likely to go anywhere than the dozens of other bills minor parties have introduced to the Parliament. They are really just written for symbolic purposes.

And appropriately enough, in this case. Banning betting odds during broadcasts is the ultimate symbolic gesture – arbitrary feel-goodism masquerading as social policy.

The backlash against sports betting exposes the flimsy edifice that Australian culture has built around sport. On the one hand, we know sport is a multimillion-dollar corporate business where young and athletic men are split into groups, churned through training regimes, and paid to compete for our amusement. It is a vast money-making ecosystem.

Sport is like Hollywood, but much less risky: investors don’t have to worry about whether the creative types will come up with new and exciting stuff.

This industry is the world of Tom Waterhouse and government subsidies for stadiums and the Australian Crime Commission’s report into sports doping and the $1.2 billion the Seven Network and Foxtel paid for AFL television broadcast rights. It is a world where behaviour standards are written into player employment contracts to “protect the brand”. People get rich, people get sacked, people get sued. In other words, sport is an industry like any other.

And that is all great. Industries are great. Yet onto this particular industry we impose a web of mythology and fantasy that tries to lift sport above a business to a quasi-religious undertaking. Nobody works themselves into a moral fervour about drug use in investment banking, or in motion pictures. But they do in sports. The sporting world is obsessed with honour and sportsmanship. And purity. It is no coincidence people keep calling for sporting codes to be “cleaned up”, or say a game was played “clean”.

The ideologists of sport proclaim it can bring communities together. In past eras – especially before the violent 20th century – they thought sport could replace warfare. These days, it is mostly about children and vague feelings of social cohesion. The federal government offers funding for a Multicultural Youth Sports Partnership Program. AFL clubs eagerly promote Harmony Day. It’s all very … romantic.

Yes, apparently there are still people who believe sport reduces social tension; people who are able to ignore the decades of violence and nationalistic politics that have swirled around domestic and international sport. And many of these romanticists appear to view the industry of sport with horror.

By now, everybody who is not a first-year arts student has come to terms with the fact that sport involves money. An older debate along these lines – about whether sport should remain amateur or go professional – looks very quaint from the vantage of the 21st century.

Sports betting is just the latest bogeyman – yet another threat to that romantic vision. Yet betting on sport is as old as sport itself. One British sports historian, Wray Vamplew, says that much of the strict codification of the rules of sport in the 19th century was driven by the needs of gambling. Early punters found it hard to bet when the rules weren’t codified.

So the sudden panic about odds being broadcast on television is a bit precious – a triumph of the mythology of sport over the reality of sport. It is indicative that most critics of sports betting say they are not worried about the betting so much as seeing the odds on television. They don’t want to break the fantasy. They don’t want to see the revenue streams behind the curtain.

For all the hyperbole and hand-wringing, sports betting is a tiny sliver of gambling in Australia.

The Queensland government keeps national gambling statistics. In 2009-10 (the latest year for which comparable figures are available), Australians spent a total of $18.5 billion on all gambling. This number includes everything from racetrack betting to pokies to TattsLotto. They only spent $303 million on sports betting – just over 1.5 per cent of the total.

Yet one academic proclaimed on The Conversation website last week that sports betting represented the steady “gamblification” of everyday life – Tom Waterhouse is a sign that Australia is being buried by gambling.

The evidence suggests quite the opposite. Total expenditure on gambling has remained steady over the past decade. And if we take population growth into account, then in recent years gambling has begun to decline. Nothing here screams “impending social problem”.

Instead, the Greens’ Richard Di Natale falls back on an old standard. “It’s becoming increasingly hard for young kids to know where the sport ends and the gambling begins,” he said in a press release announcing his bill.

That’s the think-of-the-children argument, a favourite of censors, wowsers and reactionaries for two centuries.

It is fine to view sport through a romantic lens. But that lens won’t survive if it requires deliberate ignorance.

Addicted: The Medicalisation Of Bad Behaviour

Our ancestors used religion to ward off the things that scared them. We use medicine. There are few better illustrations of the perverse “medicalisation” of society than the claim that “video game craving is as bad as alcohol”.

We’re taking the human condition (passion, obsession, desire, pleasure) and trying to turn it into a medical condition.

The story is as follows: a PhD candidate at the Australian National University recruited 38 gamers who played an average of 10 to 15 hours of video games a week. Those who reported feelings of withdrawal or cravings to keep playing their favourite game were classed as addicts.

All participants then did a simple test: they were shown a series of differently coloured words and asked to name the colour, not the word, as quickly as they could. Some of the words were related to video games, and with those words the “addicts” took longer to name the colour than the casual gamers.

The conclusion? Gaming addicts are as consumed by games as alcoholics are consumed by drinking. This is apparently “some of the first scientific evidence that video gaming can be addictive”.

But let’s back up a bit. Ten to 15 hours of gaming a week isn’t very much. The Australian Communications and Media Authority says Australians watch about 20 hours of television a week.

Sometimes we might even suffer negative consequences from this indulgence. (“One more episode of Homeland? It’s already 10.30, but …”) We may get emotionally involved in a show. We might even crave it.

But you could say the same thing about any hobby. And nobody is suggesting the average Australian is addicted to television or fishing or woodwork. At least, not in any meaningful, medical sense.

Addiction is a notoriously slippery concept. In a 2000 study published in the journal Addiction Research, 20 senior addiction experts in the American Psychological Association were asked to define what they meant by the word “addiction”. The answers differed wildly.

Only half the experts could get on board a definition that included “physical dependence”. And that was the closest they came to consensus – except for a general dissatisfaction with the way addiction has come to mean more than dependence on chemical substances.

Yet this is the muddy, vague, uncertain, ill-defined concept that we seem desperate to stamp on every sort of abnormal behaviour. Without any firm foundation, the popular use of the word addiction is creeping into the scientific world.

Excessive shopping? Addiction. Excessive internet use? Addiction.

Yes, people can make a lot of money treating these choices as pathology. There’s always a pill available, or a specialist spruiking their professional services. But we’re as guilty as the medical profession here. The medicalisation of everything is comforting.

First, there’s nothing more appealing than a scientific veneer. If someone has a few too many boozy nights in a row, they don’t go easy for a while, no – they “detoxify”. All those cultish detox diets offer little more than clean living. But they’re dressed up in pseudo-medical jargon.

Second, if something has a medical cause, it has a medical cure. This is an era of expertise and technological fixes. There is no problem that money and experts cannot fix. In January, a British MP called for the government to pay for the treatment of “those who suffer from internet or gaming addictions”. (But that’s not remotely silly compared with the Swedish heavy metal fan who is on disability support because of his heavy metal addiction.)

Medicalisation comforts because it suggests that our bad decisions are not our fault. Describing self-destructive behaviours as addictions is the ultimate way to shirk individual responsibility. Rather than agents of our own choices, we become passive recipients, preyed on by our surroundings. This is utterly dehumanising. One could ask why we’re so eager to dehumanise ourselves.

Sure, video game addiction looks a lot like a bog-standard moral panic. When someone dies from playing a game 40 hours straight – as a teenager did in Taiwan this year – commentators pontificate about video games, not, say, depression. Every pleasure has to have its dark side.

But society’s fear of addiction – our desperation to turn everything into a medical condition – goes to something deeper. We no longer burn witches; we diagnose them. Either way, we’re still chasing witches.

Critics’ Silence Adds To Walsh’s Cabinet Of Curiosities

David Walsh’s Museum of Old and New Art is an extraordinary achievement.

The art, distributed through an artificial cavern in a Hobart hillside, is surreal and otherworldly. We’ll get to it in a moment. But adding to MONA’s surrealism is the fact that the museum exists entirely outside Australia’s cultural bureaucracy.

MONA does not haggle for support from government budgets, and it is not curated by committee. Cabinet ministers cannot put their friends on the museum’s board.

Private museums are common globally but rare here. Australians expect their major cultural institutions to be wards of state and federal governments.

So the strangeness of the art is amplified when visitors realise they are not in a public building, but are instead guests on the private property of an eccentric billionaire. Before visitors even get to the artwork, they have already been treated to a vision of a different world – one where state and culture are not so perversely intertwined.

Hopefully Walsh’s dispute with the Australian Taxation Office does not put an end to it all. (As an aside, it has been great theatre to see Bob Brown stand up for an accused tax evader.)

All that said – MONA’s very existence being a triumph of private sector culture and free markets and capitalist patronage and so on – we ought to give Walsh the respect of taking his museum seriously. And MONA has some deep problems.

The clue is in the title. There is a broad range of new art in the Museum of Old and New Art. But the curators have a very particular idea of what constitutes “old” art. And their choices betray a strange sort of anti-intellectualism – as if modern art has nothing to do with history, or even its own heritage.

All curation decisions make implicit arguments. And most surveys of modern art are arranged chronologically and thematically. This has the advantage of showing those with only a casual interest the rough logic of modernism and post-modernism: how Paul Cézanne’s landscapes could have led to Jackson Pollock’s splatter paintings, or how Dadaist surrealism could have led to Tracey Emin’s soiled bed.

There is, no question, a big distance between the Mona Lisa and Damien Hirst’s shark, but it is a distance Western art has travelled, and the journey was intelligible.

MONA rejects such stodgy determinism, and tosses together art from all eras. The “Theatre of the World” exhibition, which opened in June, is the essence of MONA’s eclecticism. The pieces in this large show span 4,000 years of history. The curators jumble up everything from Pablo Picasso’s Weeping Woman to the vertebrae of a snake. The most striking room has its walls covered in Pacific Island bark-cloths, and features in the centre an Egyptian sarcophagus and one of the stretched human figures of the mid-century sculptor Alberto Giacometti. That is, ancient North Africa, the 19th century Pacific, and 20th century modernism, all in one hall.

But notably absent throughout the exhibition is any significant showing of Western art before 1850. The old art in the Theatre of the World is almost uniformly non-Western.

That observation may seem churlish (there are many outlets for European paintings; Walsh need not provide another one) but this curatorial decision suggests contemporary art arose from nothing, as if modernism and post-modernism exist entirely outside the Western tradition. Yet modern art is the direct heir of classical art.

Yes, many modern artists have been inspired by non-Western art. Picasso spent a lot of time looking at African tribal craft – a point the exhibition makes. But he also used to brag he was the best classically trained draughtsman of the 20th century. Bluster or not, he and his contemporaries drew upon the artistic heritage of centuries.

When MONA pairs a modern work – more often than not from an artist living and working within the “West” – with a metal mask of a boar from India, the sole point is to disorientate. And the desired reaction is not much more than: art is weird. With this approach, MONA struggles to be more than a cabinet of curiosities.

That’s fine. Walsh does not have to make modern art comprehensible. He is under no statutory obligation to teach. He can alienate his visitors because he, not they, paid for the gallery in the first place.

But, still. One of the arguments made by radical critics of Western art is that it looks at the rest of the world with a patronising eye. In his famous book Orientalism, Edward Said claimed the West “colonised” the East through art before it did so with muskets. Said’s book was highly flawed but highly influential. It launched a thousand PhDs. Said argued European artists infantilised the Orient by imposing on it a sense of weirdness; that everything outside Europe was alien and inscrutable.

That critique has a strange parallel at MONA. The Theatre of the World typically shows a classic modern work – such as the video of an artist who cut a house in half – and contrasts it with, say, a collection of Fijian weapons. Cultural Studies majors would call those weapons representative of “the Other”. They conjure up archaic and condescending ideas about the Noble Savage. And the comparison suggests modern artists are discovering the sort of raw, violent purity that exists only in foreign lands.

So it’s curious to see our cultural critics unwilling to deploy their poison pens against Walsh’s museum. They’re usually proud to be iconoclasts.

But then, since MONA suggests modern art has little to do with the inheritance of Western Civilisation, perhaps it is not that curious.

We’re Bombarded With Swearing But Who #*@%*! Cares?

“I like swearing; I think it’s very healthy,” Ewan McGregor told a celebrity gossip magazine last week. Good for Ewan. He could have added: swearing is so common it’s mundane. It can make you more persuasive. And it’s less offensive now than ever.

No one apparently cared when actor Jean Dujardin yelled “putain!” in his 2012 Oscar acceptance speech. “Putain” literally translates as “whore” but means “f— yeah!”. And remember when the Gillard camp released that video of Kevin Rudd swearing before the federal leadership spill? Nobody could even pretend to be offended. How refreshing. How honest. But really, any other stance would have been rank hypocrisy.

The leading scholarly authority on swearing, US psychologist Timothy Jay, estimated in a 2009 paper “The Utility and Ubiquity of Taboo Words” that the average speaker of English utters around 80 to 90 swear words every day. That’s about half as often as we use first-person plural pronouns such as “we” and “us”.

Certainly, the offensiveness of swear words varies. Jay found 10 words dominate. Some of them are gentle: “goddamn” and “sucks”. But the F-word is both the most common and the most extreme in the top 10. So it’s entirely possible the former foreign affairs minister swears less than most people do.

Yet we seem to think people are swearing more often, and more harshly. It isn’t true. There’s no statistical evidence to suggest swearing has increased over the past few decades. Studies of recorded speech demonstrate swearing has remained steady and we’re using the same words we did 30 years ago.

But swearing is more public, more frequent in film, television, on radio and in print. It’s been normalised. The prevalence of swearing hasn’t changed, but its cultural status has.

The result, as a New South Wales magistrate noted in a ruling in 2002, is that the F-word “has lost much of its punch”.

We don’t blink at French Connection UK’s acronym “FCUK”. The name of the new snack “Nuckin Futs”, approved by Australia’s trademarks examiner in January, is playful rather than obscene. If profanity can sell nibbles and knitwear, can it be considered profane at all?

This is all surely a good thing. More swearing doesn’t mean society is becoming less polite.

One can be deeply racist or sexist or homophobic without swearing. On the other hand, we have all met friendly and well-intentioned people who pepper their speech with profanity. The former (racism, sexism) has become rightly unacceptable, and the latter is becoming innocuous. This is great. Any moral compass that puts mere words on a par with malicious intent is a badly calibrated one. That’s why the N-word is now much more offensive than the F-word – it indicates racist intentions.

Traditionally, swearing has also been governed by a double standard: men would curse freely among other men but bite their tongue around women out of patronising respect. Gender equality has eroded that anachronism.

Nor does the “think of the children” mindset offer any clear restraint on profanity. As Ewan McGregor said: “I like hearing my kids swear, and I’ll pretend they’re not allowed to … but actually I think it’s quite funny.”

McGregor shouldn’t bother pretending. Jay points to findings that parental sanctions have no effect on how much a child swears when they reach adulthood. The scholarly evidence tells us children learn rude words from other kids, not adults.

Last year, research psychologists established that swearing can help with pain relief. A 2006 study published in the journal Social Influence even found swearing “significantly increased” the persuasiveness of an argument. As the authors wrote, “the use of obscenity could make a credible speaker appear more human”.

When the Baillieu government introduced on-the-spot fines for swearing in June last year, there was an understandable outcry. Almost everybody swears, and swears a lot. Punishing extremely common language is obviously a bad idea. Something so banal should not be a police matter. Even prime ministers do it, after all.