Archive for the ‘Philosophy’ Category

If people’s beliefs strengthen in the face of argument, then how do you reassess beliefs to arrive at the good ones?  Epistemic Nihilism.

Destroy everything in your head, attacking from all angles with all other beliefs.  Throw beliefs at each other, they your soldiers and you their general.  Marshal them to war and pit them mercilessly and endlessly at odds.  Those nations of beliefs which stand, those individuals which triumph, they should be your initial guide to the beliefs which stand above the others.

And never forget to send in the cowards and deserters which hide amongst the shadows of your mind.  They too must be tested.



Go Here and Read. (H/T Reddit)


Steve Waldman of interfluidity writes stuff in response to Tyler Cowen writing stuff.  I can recommend reading interfluidity without reservation.  Go, do so.  Come back in a few days.  I might even have a new post (in line with my current schedule of “a day or so after Lum posts”).

One bit I wanted to pull attention to:

For the most part, Tyrone pointed out, technological progress, is labor displacing. It simultaneously creates valuable new techniques for reconfiguring real resources while diminishing the number of people who are required to participate in those transformations, and who can therefore trade their participation for spending power. There is a myth among neoliberal economists that labor markets have always “adjusted” sua sponte: that when laborers were displaced from farms, “higher value” factories arose to employ them; that when the factories were downsized and offshored, a more pleasant, higher-value service economy came to be; etc. That narrative is wrong, he told me. At best it is criminally incomplete. With each technological change, new social institutions had to arise to sustain dispersed purchasing power despite a reduction of numbers and bargaining power of workers in old industries. Displaced workers ultimately did find new work, but only because the new social institutions “artificially” created buyers for all the things displaced workers reinvented themselves to sell. Without this institutional innovation, Tyrone tells me, something like the Great Depression would have been the new normal. Historically, institutions that have arisen to sustain purchasing power despite increasingly labor-efficient core production include direct government transfers and expenditures, labor unions, monetary policy interventions, financial bubbles and financial fraud.

Basically, the standard cybernetics argument!  This single insight into technological progress is the foundation for damn near every scifi novel ever, particularly any that touches on the social situation of its posited future.

Dystopian scifi focuses on the inevitability of the masses losing out.  In that case, in a capitalist society, the owners of the means of production (which roughly means the owners of the implementations of technology) reap the benefits of technological advancement, while labor is relegated to cogs in a machine that increasingly has little use for them, until they become completely displaced, totally irrelevant, and utterly incapable of finding a handhold in the increasingly distant, technologically advanced society.

While the lot in life of the displaced masses tends to improve in some ways, that advancement certainly doesn’t seem to have done much for their lives relative to those of the medieval peasants of Europe and Russia.  They still live in relative poverty, at the whims of property holders who really don’t give a shit about them.

Utopian scifi instead focuses on the freedom engendered by such technological transformation.  It is a good thing that labor isn’t needed: that labor is freed to do things that require a more human touch.  Surely if a robot can do it, it probably isn’t worth wasting a human’s time on it (assuming non-sentient robots or whatever; I don’t want to get into discussions about the ‘other-chassis-ed’).  Those free humans now have opportunities to go do something else!  And since stuff is cheaper, purchasers now have more money to waste on what would have been considered useless crap, but now gains potentially new value.

For instance, this blog is a prime example of what Utopian scifi authors consider a win for technology: a complete waste of time for both you and me were we both living in grinding poverty, but with the leisure engendered by new technologies (because now we don’t have to waste time worrying about food and shelter), now suddenly my asinine commentary on philosophy, games, and economics can be viewed as a new value! (think Pyramid of Needs here).

What I find interesting is that “progress” must be viewed through a variety of lenses.  Dystopians have a pretty solid reason for arguing that new tech screws most everyone over.  Utopians have good arguments for saying the exact opposite.  And I say they’re both right, because it entirely depends on both your point of view on what’s better and on the frame in which the technology occurs.

But ignore me.  Go read Waldman.


This article from the BBC discusses a study which correlates increased living space with increased evolution.  More particularly, the ability of a mutation to thrive is directly correlated to the ease with which it can enter an evolutionary niche.  The article ends with a critical quote from another evolutionary biologist:

And in general, what is the impetus to occupy new portions of ecological space if not to avoid competition with the species in the space already occupied?

The problem with this question is that it subtly misinterprets the study results.  First, generally, evolution is success by accident, so there’s no “impetus” involved.  But more importantly, the study is pointing out not that evolution drives creatures into additional ecological spaces, but rather that, having found ecological spaces, things evolve.

Survival, once a free ecological niche is found, is assured for some time.  Take the evolution of birds.  Once they evolved wings, why evolve further?  There’s no reason to, no evolutionary pressure; especially initially, when individuals can just move into open space within the ecological niche they already inhabit.  However, there is ample room for new species to evolve without any threat.  Competition does not necessarily provoke proliferation through specialization.  Instead, it could as easily reduce the ability of new trials to succeed and thrive, since they would be rapidly killed by ruthless competitors before they could find their niche.

When success is assured and survival simple, though, new traits have a much easier time developing without significantly harming the survival opportunities of the infant species.

The impetus to occupy a new niche, then (if we can speak of such a thing), is not to avoid competition.  The animal moving into the niche is likely the descendant of a successful survivor and competitor.  It has no particular need to expand.  Rather, the impetus is more likely “why not expand?”


So sayeth the guy who talks about movies a lot: never can video games aspire to have, amidst their mighty pantheon of wondrous achievements, a single, solitary instance of a “Work of Art”.

I disagree, and I think it comes down to definitions here.  Apparently, I have a broader definition of art than Ebert…or perhaps we approach it in two very different directions, leading to radically different final definitions that end up with some overlap.  In reading through his argument, I’m really unable to discern precisely what his definition of art IS.

Of course, the basis of his discourse is a rebuttal of a rather poorly made argument, by a woman who undermined her point at the very beginning of her talk by agreeing with Ebert that “No one in or out of the field has ever been able to cite a game worthy of comparison with the great poets, filmmakers, novelists and poets.”  This isn’t a very persuasive rhetorical maneuver, nor is it true.  I emphatically can name games worthy of such comparison, and know people who would name other such games.

She then continues down the strictly shoddy rhetorical path of continually throwing up obstacles to her argument.  Games which have been with us for centuries or millennia are not art.  This actually isn’t too controversial, which is why I think she so easily agrees: how is baseball art?  Football?  Mahjong?  We don’t really call these art normally, and since we’ve never granted them the positive assertion, then the negation must apply.  If something isn’t art, it’s “Not Art”.

I think this is partly a problem of our linguistic structure, where we have the exact same syntax for denying a property to something and saying we don’t know if it has that property.  If we never say something is art…that could mean, you know, that we’ve just never gone about thinking about it.  But Kellee Santiago simply goes on with the uncontested assertion that none of these games are art, appealing directly to common usage.  Never does she ask why they aren’t art.

Instead, she hops over to wikipedia to try and define art, arriving at this:

Art is the process or product of deliberately arranging elements in a way to affect the senses or emotions.

Of course, the article has a giant label over it, pointing out potential issues with it.  This definition, for instance, has no corroborating citations.  Wikipedia is built strictly on summarizing multiple sources in an effort to distill common, accessible knowledge, of which definitions are a part.  But let’s work with this, since it’s going to, at least initially, be Santiago’s starting place.  It’s worthwhile to point out that further down, wikipedia delves into the substantial argument raging within the philosophy of aesthetics (a particular philosophical endeavor I’ve always found cringeworthy, anyway), quickly diluting the clarity of its original definition.  Further, as Ebert notes, chess may fit that definition…as might football, baseball, mahjong, and other non-video-games.

Part of the issue here is that beauty, passion, the “Dionysian”, as Nietzsche called it, is somehow missing.  Take Kellee’s final definition: “Art is a way of communicating ideas to an audience in a way the audience finds engaging”.  What, precisely, is the idea communicated by music?  It doesn’t seem to be anything specific.  Bach’s Toccata and Fugue in D Minor, something I tend to consider a staggering work of art, seems to offer no specific concepts or conceptual arrangements.  Pachelbel’s Canon in D is another wonderful artistic accomplishment, and the idea there seems to be…the generally accurate application of the rules of the canon, rules which are quite precise.  Indeed, the act of creating a canon, as Bach was famed for, was very much like…a game.  Simply look at the puzzle canon.

In fact, it feels like she just pushed the argument off.  Anything can be said to communicate an idea, as, strictly speaking, the moment a human comes in contact with something external, an idea is formed to attempt to mentally model this something.  Really, the thing differentiating art seems to be “engagement”, which remains poorly defined.

See, there’s little point in continuing.  We can’t use the examples she gives as a means of inductively arriving at a general definition of art, because she’s trying to convince us they are instances of art.  We’d need to know they were first, and then try to move from there.  So really, she’s trying to claim her definition is good, and show how games are getting close to whatever her definition is.

It’s no good; her argument is, as Ebert properly notes, not terribly cohesive and difficult to follow from the examples given.  She already conceded the ground that these examples aren’t art…they’re apparently the chicken scratches of early cave painters.  I agree with Ebert in his response to this:

They were great artists at that time, geniuses with nothing to build on, and were not in the process of becoming Michelangelo or anyone else.

Art, I think, has to be considered an end in itself.  A work of art is self-contained.

But then, what is art?  Ebert doesn’t seem to have a solid definition either, nor does he respond to the “engagement” definition.  He does point to a single difference, as he sees it, between games and art.

One obvious difference between art and games is that you can win a game. It has rules, points, objectives, and an outcome.

This is interesting, I think, because it seems to miss the point.  It is not that games have “rules, points, objectives, and an outcome”.  That’s like saying a human body has hands, feet, torso, chest and legs.  A game IS those things.  He’s correct: if you don’t have rules and goals, you don’t have a game.  You can win at a single instance of a game; you cannot “win” a game.  The game sits separately from our playthroughs, giving weight to them, providing the bones to hold up the meat of our interaction.

A game, in its essence, consists entirely of a conglomeration of concepts, of abstract rules governing a “world”, a set of particles upon which those rules operate.  If ever there were a medium which sought to express ideas, games, in all their forms, are it.  Games are representations of structures of motion, whimsical, semi-real, relevant, or absolutely divorced from any connection to the rules of our own world.  All of the narrative constructs, the stories, the setting, the graphics, the music, the controls, the design, all of these are woven together into the higher form of a game.  They are subordinate to the whole, which is experienced as discovery.

Consider a Monet.  Impressionism is not, at least to me, terribly interesting initially.  However, upon finally seeing a Monet, in person, from a distance, I discovered what was being shown: not simply the static image of a pond, or a snowed-upon roof, but the distillation of the motion.  Seen from afar, they managed to convey a more dynamic, visceral representation of the subject matter than a more statically detailed rendition.  That’s not to say Impressionism therefore was better art…it simply has its own virtues.

I am left to think that it is something of this sort which makes something potentially art: its ability to evoke passion in us.  Art is something towards which we cannot be indifferent, or at least which was cared for in its making.  Passion is the true tool of the artist, and all the myriad forms of “art” are simply new ways to express the passion of the artist.  Perhaps I am simply troubled by the lack of concern for the artist, for we focus on the audience.  Does an artist even think of the audience?  Is an audience necessary to art?  Or is art the result of the vain attempt to give form to what is strictly a concept, to birth into reality something hinted at in a mind?  If it is the latter, isn’t the impact on the audience simply a secondary concern?

Really, part of my disagreement here is that I find logical proofs to be artistic.  Indeed, they cannot fail to meet any definition provided by either Ebert or Santiago, yet I bet neither of them would wish to call logical proofs…art.  There is, however, no other word I can find to apply to the deft mental construction, the elegance and awe-inspiring brilliance of things like Cantor’s Diagonalization Argument or Gödel’s Incompleteness Theorems.  If math and logic can produce works of art, I find myself open to discovering works of art anywhere.
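As a hypothetical aside of my own (nothing Ebert or Santiago discuss): the heart of Cantor’s construction fits in a couple of lines of Python.  Against a finite table it’s just a party trick, but the same flip-the-diagonal move, run against an infinite enumeration of infinite sequences, is the entire proof.

```python
def diagonal(table):
    # table[i][j] is the j-th digit of the i-th listed binary sequence.
    # Flip the diagonal: the result differs from row i at position i,
    # so it cannot equal any row in the table.
    return [1 - table[i][i] for i in range(len(table))]

rows = [
    [0, 1, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
    [1, 0, 1, 0],
]

d = diagonal(rows)
print(d)          # [1, 0, 1, 1]
print(d in rows)  # False
```

Elegant, and exactly the sort of deft mental construction I mean.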

Thus, I disagree with Roger Ebert and Kellee Santiago that games have yet to produce a masterpiece of the form.  I am quite willing to assert that Silent Hill 2 ranks as an absolute masterpiece of a game, on par with any of the masterpieces of psychological thriller I have seen and certainly equal to any of the great novels I have read.  I put down Super Mario Brothers as an example of distilled fun, and put forward that watching what truly great players can do with the game says that the game itself is a work of art.  I would happily contend that Modern Warfare 1 (but not 2) was an absolute masterpiece of first person shooting and single-player gaming.  I think Final Fantasy VI and VII are both worthy of being called works of art, along with Chrono Trigger.  These games were not merely pivotal points in games history.

I shouldn’t fail to include Braid, though I sadly haven’t played it.  I would happily call Audiosurf a successful piece of experimental art.  Geometry Wars Evolved also qualifies as a work of art.  For me, Extreme G 2 delivered to me an experience which was evocative and brilliant (I couldn’t quite call Extreme G 3 better, and I didn’t play Wipeout XL, so I can’t comment on it).  Many Bioware games could arguably fit within this pantheon of masterpieces of art, along with many of Blizzard’s works.

Tetris is a masterpiece.

These are things I would put in a gallery, if a gallery could be a place where one person could sit, ensconced in the setting envisioned for these games for the time it takes to squeeze their wonder from them, and call them worthy of display.  I do not think games are lacking for works of art.  I think the world of video games overflows with some of the most profound and earnest creative effort humanity has ever borne witness to, whether crass entertainment or high-minded morals.  Heck, the world SURROUNDING games gives birth to a massive creative effort, from the fansites to the webcomics to the theorycrafting to the remixed music.

In fact, I defy Santiago or Ebert to describe a single artistic subculture which has ever promoted such a diversity of creative effort.


How people view and relate to the concept “religion” fascinates me, as was probably obvious from that discussion I had about Science as Religion with Jormundgard back in December.  Kotaku is apparently taking this week to discuss religion and video games, and has kicked it off with an article by Owen Good on religious depictions in video games, and why it is so infrequent to see real religions actually explored in games.

Now, I’m not terribly concerned with the reasoning and conclusions of the article itself; rather, I’m interested in the way the article discusses religion.  Indeed, the comments following the article are even more intriguing, as they illuminate the basic contexts of the respondents and how they view religion.  Most revealing is not what is said, but what is assumed and what is not said.


I still remember the spirited debate I had here with Jormundgard about Science as Religion.  I’ve been rifling around for ways to properly restart that conversation, but my mental inertia fizzled, the train of my thought changing to different tracks.  Despite that, I still peer out the windows, longingly gazing upon the mountains of that far country and the discoveries to be made upon their sides, amongst their foothills and valleys, and the vistas upon their peaks.

And I remember that conversation was begun talking about climate change and skepticism, sparked by talk of the Climategate Emails.  It was with this rolling about in my head that I read this interview with James Lovelock.  Here’s, for my money, the choicest quote:

What I like about sceptics is that in good science you need critics that make you think: “Crumbs, have I made a mistake here?” If you don’t have that continuously, you really are up the creek.

I’m relatively certain that this feeling is shared amongst all scientists, amateur and professional.  Indeed, on the grounds of the Church of Reason, amidst the cathedrals of the university, I’m certain it’s felt they are all skeptics and critics, for isn’t that fundamental to science?  The asking of questions?  Isn’t that the nature of Reason?

It is not, however.  Consider that the Logos as pursued by the disciples of Socrates (Plato and his student, Aristotle) moved along two paths: Dialectic and Logic.  Dialectic is the pursuit of knowledge through discourse, along the lines of questions and answers.  It is the very essence of skepticism; Logic, though, is not.  Aristotelian Logic, rather, is the exploration of existing axioms to produce theorems, the development and exploration of hierarchies, the refinement of models.  To be a skeptic is to question the assertions of a model, which is not the purpose of analytic logic.  Analytic logic is strictly concerned with the development of the model itself, not with its truth (or applicability or whatever).

That isn’t to say analysis is bad or unable to ascertain truth, but rather that it cannot question itself because it assumes its own accuracy.  Consider a standard logical formulation:

  1. p: Axiom 1
  2. p->q: Axiom 2
  3. q: from 1 & 2

We haven’t honestly said anything here beyond “given 1 and 2, 3”.  We haven’t demonstrated 1 and 2, so we haven’t demonstrated 3.  For any more complex demonstration, the set of assumptions grows, and over it all hangs the cloud that the particular system of logic itself is an assumption.  A model of truth, not truth itself, the wall we touch in the dark and assume indicates a building is there.
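To make that concrete, here’s an illustration of my own (not anything from Lovelock’s interview): brute-forcing every truth assignment confirms the inference form is valid, while also showing that q, stripped of its premises, demonstrates nothing at all.

```python
from itertools import product

def implies(a, b):
    # Material implication: p -> q fails only when p holds and q does not.
    return (not a) or b

assignments = list(product([False, True], repeat=2))

# The inference is valid: in every assignment where both axioms hold, q holds.
print(all(q for p, q in assignments if p and implies(p, q)))  # True

# But q alone is no tautology: without the axioms, nothing has been shown.
print(all(q for p, q in assignments))  # False
```

The machinery happily certifies the step from 1 and 2 to 3; it has nothing whatsoever to say about whether 1 and 2 hold.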

Most science proceeds in the direction of examining and explaining the results of a series of assumptions.  Realize that measurements are taken assuming certain contexts hold true; it is very rare that a measurement in such circumstances can contradict the assumptions behind its importance.  Scientists tread well-worn paths, waggling their beards to and fro as they look for changes to the grass lining their roads, or stopping to set a loose cobble firm.  They are not given to striking out perpendicular to the given path, as the important bit is the destination at the path’s end: they already know where they are going.  They might stop to make the passage easier for those who follow, but new trails are not blazed by people who know where they are going.

