This is Part 1 of a three-part article. Part 1 covers some
background on gaming as an industry and its main conference, E3.
If you're not already in the know, E3,
a huge conference covering most anything video game related, just
passed. If you're only finding out from this post, you missed it. You probably
don't love video games, but that's OK, I'm not judging. What you missed was a
huge expo of the games to come this year, with each of the major players in the
console arena (Sony - PlayStation
4, Microsoft - Xbox One,
Nintendo - Wii... er, Wii U)
putting on the best show they could to excite consumers for the next 12 months
or so. This matters for a few reasons. These companies show tech demos and
game demos to entice consumers and game and software developers alike into
their respective camps, to endorse and invest in their hardware. Gaming is now a multi-billion-dollar industry, so this is an important decision.
This is never more important than during years like last year, just
prior to the launch of a new console generation, when these companies are
on the hook to show the world why consumers and developers should invest in
their particular console ecosystem. In case you aren't familiar: though the
same games may come out for multiple platforms, the platforms are generally
not compatible with one another. A new console generation means new technology,
new gaming IPs, new software languages to learn, new bugs to deal with, and
increased development time, to name a few things. For many developers, it
often means buying in on a particular console to focus on, so they can put out
the highest quality game with as few compromises as possible.
For consumers, many of whom cannot afford one, let alone three,
consoles at launch (at least at first), it means deciding which
console to save their pennies for and begin investing in, both in hardware and
software. Peripherals such as controllers, headsets, and cameras (I could write
another article on that one alone) will likely not be included beyond the
basics, and are added costs. Buying in to a new console can easily cost the
better part of a thousand dollars once one buys games and peripherals, as is the
case with dedicated gamers, amongst whom I normally count myself.
This means that each company must think long and hard about how
it approaches these launches. That may mean changes in policy for how games
will be distributed. I don't need to tell you that we are living increasingly
in a digital age, one where the physical world is becoming... well, redundant.
This is evidenced by the evolution of console media. Where one was
once restricted to the number of cartridges or discs one
could afford (and possibly carry), today's consoles do not rely
strictly on physical media to play the software (media and/or games) they need
to be useful. Increasingly, software can be installed onto the console
itself from physical media, or simply downloaded online, so there is no actual
need for physical media whatsoever.
So what?
What does that actually change? Games are games, right? Films are
films, and music is music. If you buy it, you buy it, and you should be able to
sell it again, n'est-ce pas? Not necessarily. At the 2013 E3 conference, Microsoft
stirred up quite a controversy by changing its policy on physical copies of
used games. It even went as far as adding new sharing features that it
claimed would benefit users in ways only made possible by
the new changes. Was Microsoft being disingenuous? Licensing games instead of
selling them outright? It sounds like a step backwards, doesn't it?
Check back soon for Part 2 and Part 3 to get a better picture of the
relevance of these changes and how they may or may not change the way we view
ownership of media, specifically digital content.