This story was written by a human, but it's possible that other articles you've read today — from Wikipedia entries to news briefs — were written by computers.

Software programs known as bots are creating an increasing amount of content, raising questions about authorship and about how much value such content has.

As algorithms improve, more and more content will be produced by computers, and at least one man believes a computer could even win a Pulitzer in the next 20 years.

Writing vs. compiling

Phil Parker and Sverker Johansson are two of the most prolific authors in the world.

Parker has published 200,000 books with such riveting titles as "The 2007-2012 Outlook for Tufted Washable Scatter Rugs, Bathmats and Sets That Measure 6-Feet by 9-Feet or Smaller in India."

Using computer algorithms that collect public information, Parker is able to "write" an entire book in under an hour.

Johansson, on the other hand, is responsible for 2.7 million Wikipedia articles, or 8.5 percent of the collaborative Internet encyclopedia.

In addition to writing numerous articles on Filipino cities, the 53-year-old Swede — who has degrees in linguistics, civil engineering, economics and particle physics — has created entries for every known bird and fungus species.

However, Wikipedia purists criticize Johansson's methods because the majority of his entries were compiled by a bot named Lsjbot.

Lsjbot collects information from databases and other digital sources and can create up to 10,000 new entries a day.

Many of these articles are stubs, meaning they contain only basic information, such as a city entry that simply lists the city's coordinates and population.
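
The article doesn't include Lsjbot's code, but the pattern it describes (pull a record from a database, then drop its fields into a fixed sentence template) can be sketched in a few lines. The sample records, field names and wording below are illustrative assumptions, not Lsjbot's actual sources or output.

```python
# A hypothetical, minimal sketch of template-based stub generation.
# This is not Lsjbot's actual code; the records and wording are invented.

GEO_RECORDS = [
    {"name": "Exampletown", "country": "the Philippines",
     "population": 12345, "lat": 14.60, "lon": 121.00},
    {"name": "Sampleville", "country": "the Philippines",
     "population": 6789, "lat": 10.32, "lon": 123.90},
]

# One fixed sentence pattern; every record produces the same kind of stub.
STUB_TEMPLATE = (
    "{name} is a municipality in {country}. It has a population of "
    "{population:,} and is located at {lat:.2f} N, {lon:.2f} E."
)


def make_stub(record: dict) -> str:
    """Fill the template with one database record to produce a minimal entry."""
    return STUB_TEMPLATE.format(**record)


if __name__ == "__main__":
    for record in GEO_RECORDS:
        print(make_stub(record))
```

Looping a template like that over thousands of database rows is what lets a bot turn out entries at a pace no human writer could match.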

Johansson admits that Lsjbot's entries can be bare-bones and boring, but he says creating such stub articles is useful because writers who know more about a topic can fill in additional information later.

His critics argue that such stub articles contain so little information that they're hardly doing a service to readers, and even Lsjbot's longer entries are subpar because they lack the creativity a human author would bring.

But Johansson says Lsjbot-created articles combat a problem that's inherent in a collaborative encyclopedia: People tend to write about topics that interest them, so there's a wealth of information on some topics and very little on others.

As an example, he points out that on the Swedish Wikipedia there are more than 150 articles on "Lord of the Rings" characters and fewer than 10 on important figures in the Vietnam War.

"I have nothing against Tolkien and I am also more familiar with the battle against Sauron than the Tet Offensive, but is this really a well-balanced encyclopedia?" he told the Wall Street Journal.

Bots get bylines

Bots aren't new to Wikipedia. In fact, they've been part of the encyclopedia's development almost since its beginning.

Within a year of its 2001 founding, a bot known as "rambot" created 30,000 Wikipedia articles on U.S. towns.

Using Census data, the bot generated short entries for each town that contained little more than demographic statistics. Once the entries were created, human editors stepped in to flesh them out.

In 2008, another bot created thousands of stub articles on asteroids, using data from a NASA database.

Wikipedia also relies on hundreds of bots that monitor and edit entries; ClueBot NG, for example, can detect and revert vandalism within seconds.

But bots aren't simply generating and moderating thousands of Wikipedia articles — they're also writing news stories that appear in publications like the Los Angeles Times.

Times reporter Ken Schwencke created Quakebot, an algorithm that automatically writes and publishes a story on the newspaper's website every time an earthquake is detected in California.

Within three minutes of a July 8 tremor, Quakebot had published a short story detailing the event.
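
The Times hasn't published Quakebot's internals, but the workflow described here (an incoming earthquake alert triggers a templated story) might look roughly like the sketch below. The alert fields, template wording and sample data are assumptions for illustration, not the newspaper's actual code.

```python
# A hypothetical, minimal sketch of an event-triggered news bot.
# This is not the LA Times' actual Quakebot; the alert format and wording are invented.

from datetime import datetime, timezone

STORY_TEMPLATE = (
    "A magnitude {mag:.1f} earthquake was reported {miles} miles from "
    "{place} at {when:%H:%M} UTC on {when:%B %d}, according to preliminary "
    "data. This post was generated automatically from an earthquake alert."
)


def write_quake_story(alert: dict) -> str:
    """Turn one structured earthquake alert into a short, publishable story."""
    when = datetime.fromtimestamp(alert["time"], tz=timezone.utc)
    return STORY_TEMPLATE.format(
        mag=alert["magnitude"],
        miles=alert["distance_miles"],
        place=alert["place"],
        when=when,
    )


if __name__ == "__main__":
    # Invented sample alert; a real bot would listen to a seismic data feed instead.
    sample_alert = {
        "magnitude": 4.4,
        "distance_miles": 6,
        "place": "Westwood, California",
        "time": 1404806000,  # Unix timestamp of the made-up event
    }
    print(write_quake_story(sample_alert))
```

Everything after the alert arrives is essentially template filling, which is why a story can appear within minutes; the human contribution lies in writing the template and deciding when such stories should run.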

But could bots write the journalist right out of journalism?

Yes, and no.

Kristian Hammond, co-founder of Narrative Science — a company that designs software to write news stories — believes that more than 90 percent of news will be written by computers by 2027.

But instead of computers and algorithms replacing human reporters, Hammond thinks news writing will simply expand.

While bots mine data and churn out stories on events that reporters aren't covering, journalists will still write with greater depth and more creativity.

As Will Oremus writes in Slate, there are some things humans can do that bots can't.

"We're good at telling stories. We're good at picking out interesting anecdotes and drawing analogies and connections. And we have an intuitive sense of what our fellow humans will find relevant and interesting. None of these qualities come naturally to machines."
