Apparently, complaining in public worked for Tim Schafer’s Psychonauts:
"We are working on Psychonauts, I promise.
Later,
Alan Stuart
Emulation Ninja
Xbox 360 Team"
Or maybe it’s just pure coincidence.
I ran into this paper which addresses the topic of software reliability from a (to me) very surprising point of view. In a nutshell, it defends the notion that our computing model (Turing’s) ties together software complexity and unreliability, but there are fundamentally different approaches where increased complexity actually increases reliability as well. Integrated circuits and our very own brain use such approaches.
It is true that Alan Turing and Fred Brooks are among the worshipped gods of modern software engineering. Being a deeply convinced agnostic and wannabe atheist, I jumped at the opportunity to read about ways to question and (possibly) get rid of our gods.
Aside from a few remarkably funny sentences, like the politically incorrect "all connectors should be unidirectional, i.e., they should be either male (sender) or female (receiver)," the article is intriguing. The ideas are related to neural networks and self-adjusting systems, and I can even identify a touch of functional / declarative concepts in there; it’s a shame that the article never explores these parallels.
Now, my two main problems with the article are:
– I’m not sure that integrated circuits are so fundamentally complex. Size is not the same as complexity, and the fact that modern CPUs have hundreds of millions of elements does not automatically make them more complex.
– The brain may be seen as reliable in the sense that it is able to perform fairly well in very obscure, ambiguous and imprecise circumstances. However, its output is equally imprecise, and worse, unpredictable. No two people are alike and all that. But predictability and precision are among the most common traits we require of our software systems.
The Turing computing philosophy typically addresses the problems of reliability and faulty software by brute force: strict specification and domain limitations, development protocols and verification procedures. These are not without a price: efficiency and speed of development suffer greatly. It’s common for fault-safe developers to produce functionality at a tenth (or worse) of the speed of their less strict counterparts. Software Engineering techniques such as Object Orientation or Test-Driven Development try to provide helpful improvements (rather than full-fledged solutions) in software quality without incurring great costs. The old 90/10 rule again.
It is unclear whether Savain’s approach can produce a real, working, and more importantly, practical development environment and process. It is even less clear whether this model can be applied equally well to all areas of computing, or whether it would end up restricted to specific domains like pattern recognition and creative problem-solving (which is where the brain excels).
But it’s always a good idea to consider alternatives and think outside the box. Especially on a rainy day, and it’s pouring outside my window.
Edit: Reading this description of the synchronous, signal-based Esterel language should clarify some of the concepts and approaches described by Savain.
I only spent one year working at High Voltage Software in Chicago, but I have many fond memories of my time there. One of these was working with Eric Nofsinger, who is featured in this Gamasutra feature. While he may sound fairly corporate in his responses, he’s as driven and talented as they come. He describes some of the joys and woes of being an independent developer, working with licensed material, and striving to create original IP.
Edit: Corrected the link, I had copied Something Awful’s Photoshop Phriday instead. 🙂
Well, not for a couple years but still quite a bombshell. The interesting part is, pretty much every comment seems to agree in lamenting that it’s not Steve Ballmer going away instead.
Hm, too many Microsoft-related posts lately, I gotta start looking elsewhere.
This is too funny. If you are a fan of Psychonauts, you might want to send Microsoft an email expressing your preferences for backwards compatibility support on the 360, but I suggest you get a bit more creative than a straight cut & paste of Tim Schafer’s text. 🙂
Rumour has it that Criterion’s Burnout team offered their support to make their Xbox game work on the emulator, and ended up releasing a native (and awesome) version instead.
I for one keep rooting for Soul Calibur II to make it onto the list. My brother Juan Carlos is one of the fabled "emulation ninjas" and he loves that game, so don’t be surprised if your own preferences are not heeded as quickly as you’d wish. In the end, support is based on the market value of the title, or on the sheer luck of the game actually working fine without any effort from the backcompat team.
But it never hurts to voice your opinions.
Programmers and programming teams should strive to use a reasonably well defined and consistent set of coding standards in their projects. These typically include rules for notation (use of case and special characters in identifiers, etc), style (layout of language elements within the source code), and idioms (specific usage of language features).
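To make those three categories concrete, here is a minimal sketch using made-up conventions; the class and the rules themselves are hypothetical, not taken from any particular standard:

```cpp
#include <cstddef>

// Notation: type names in UpperCamelCase, member variables prefixed with m_.
class TextureCache
{
public:
    // Style: one declaration per line, opening braces on their own line.
    explicit TextureCache(std::size_t maxEntries);

    // Idiom: the cache owns its resources, acquiring them in the constructor
    // and releasing them in the destructor (RAII), rather than exposing
    // separate Init()/Shutdown() calls.
    ~TextureCache();

private:
    std::size_t m_maxEntries;
};
```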
The benefits of a clear and useful coding standard are twofold: it makes the code easy to read and understand even in the absence of code inspection tools or previous knowledge, and it encourages the creation of code whose performance and functionality are safe, predictable, verifiable and maintainable.
Programmers tend to be fairly vocal and strict about defending their preferred set of coding standards (whether they are equally strict at adhering to these standards varies a lot). This often leads to endless religious wars where opinions and personal experiences are presented as facts or universally applicable ideas. But if widely varying sets of standards have proven to work in different projects, can there really be a universally "best" standard?
Of course there isn’t. Duh!
There are bad practices, good practices and best practices. Bad practices (for example, lack of error checking) can be considered to produce universally undesirable results, whereas best practices (for example, strict const-correctness) should yield better code regardless of the circumstances.
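For instance, the difference between those two extremes might look like this; the file-size helper is invented purely to make the contrast visible:

```cpp
#include <cstddef>
#include <cstdio>
#include <stdexcept>

// Bad practice: fopen() can fail, but the result is never checked, and the
// non-const parameter makes no promises about the input.
std::size_t fileSizeSloppy(char* path)
{
    std::FILE* f = std::fopen(path, "rb");
    std::fseek(f, 0, SEEK_END);   // undefined behaviour if f is null
    std::size_t size = static_cast<std::size_t>(std::ftell(f));
    std::fclose(f);
    return size;
}

// Better: failures are detected and reported, and const-correctness documents
// (and enforces) that the path is only read, never modified.
std::size_t fileSizeChecked(const char* path)
{
    std::FILE* f = std::fopen(path, "rb");
    if (!f)
        throw std::runtime_error("could not open file");
    std::fseek(f, 0, SEEK_END);
    const long size = std::ftell(f);
    std::fclose(f);
    if (size < 0)
        throw std::runtime_error("could not determine file size");
    return static_cast<std::size_t>(size);
}
```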
The thing is, most decent standards offer mostly a collection of good practices. The value and applicability of these are relative to the context in which they are used, and as such can’t be considered universally valid or invalid. Different projects and different programmers have varying requirements and levels of expertise. Programmers naturally tend to restrict and partition any problem space into manageable subsets. Divide and conquer, so to speak. However, the huge scope of what defines the context of a programming task often leads to implicit and biased assumptions about what is practical, what is relevant, and what is plain habit.
Being exposed to, using, and adapting to different standards is, in my experience, one of the best ways to gain insight into what defines the context of a project, and into what works best in different situations.
John Lakos’ "Large Scale C++ Software Design" is one of the classic books touching this topic, but there are few freely available online references. I was happy to come across this coding standards document in Bjarne Stroustrup’s homepage. It is a very long but enjoyable exploration of a C++ coding standard that is widely different to the one I regularly advocate and use. Every goal, concept and rule (and there are many) is described and justified in a very concise manner. All of them make a lot of sense.
Being safety-critical is an often desired quality, but rarely a top priority of games software. That is probably a good place to start when comparing that document with your own context and coding standard. Enjoy the ride!
This new Spanish webmag just published their first issue, which includes an article by yours truly. Best of luck to Bor and all the crew!
A History of Violence and Lord of War. It’s a great time to be a movie lover.
Over the past few years, I have devoted short chunks of time to reflect on what my personal ideology is. I suppose age is a natural motivator for this sort of process, but there have been other fuels burning that fire. Most notable among them is being accused of ideological relativism; of offering explanations to phenomena rather than solutions to problems; of flagging any subject on which I don’t have a strong position as being too unfocused to be worthy of a position; of using tolerance as an excuse for avoiding a clear stand on subjects in which I am actually not confident, or more to the point, knowledgeable.
You probably visit this site due to an interest in gaming, programming and "modern" entertainment. I hope you enjoy the few contributions that I can make in the form of source code, commentary and links to interesting articles. I won’t bore you with details of my personal reflection on religion, politics, morality and the coveted meaning of life. But this article by Dan Simmons (of "Hyperion" fame) was one of the best essays I’ve read in a long time. It addresses most of the issues going on in my head about the practical conflicts raised by immigration and tolerant multiculturalism.
If anything, it reaffirmed my relatively recent conviction that preserving the western model of human rights is more important than preserving religious, cultural and national identities and qualities. How you achieve that, and who uses that as an excuse for fulfilling less humanitarian agendas, are completely different topics.
The MiniMicrosoft blog has been a very popular place for Microsofties to anonymously bring up and discuss the issues that plague their company. For those of us not in the loop, the blog might be just a PR stunt by the company’s HR department, or maybe it’s exactly what it says and has somehow managed to avoid being bombed by Microsoft’s policy-enforcement teams. I don’t know. But they are pounding their chests a lot over the recent announcements by MS executives regarding certain HR practices.
If the blog truly made a difference, kudos to the author and participants for managing an incredible feat. I have no idea about the actual scope and implications of these changes, but the idea that a company as big as Microsoft can change course in its corporate practices and culture, and that employees can affect that process, is heartwarming and fills me with optimism.
The criticisms found on that site are often harsh, but they are always well presented and usually lack the cynical approach of the classic "disgruntled employee". I’m sure that played a big part in any improvements they have managed to contribute to.
Buried in the comments was a link to this interesting article about talent and corporate culture. One nugget: "forty per cent of those students who were praised for their intelligence lied about how they had scored on the test, adjusting their grade upward. They weren’t naturally deceptive people, and they weren’t any less intelligent or self-confident than anyone else. They simply did what people do when they are immersed in an environment that celebrates them solely for their innate "talent." They begin to define themselves by that description, and when times get tough and that self-image is threatened they have difficulty with the consequences." There are a few scary comments about how people can be blinded by some kind of idealized "talent" and disregard the actual performance and practical application of that talent. An excellent read. Related articles also at Joel On Software: here and here.