C# rapid game development

I've been working on a Direct3D game in C++ for quite a while now. I recently tried using Direct3D in C# and it took me only 15 minutes to get a basic framework running, instead of the 5 hours it took in C++. Now I know that C# isn't favoured for game development because it is slower, but it allows rapid development and is a lot more maintainable.

I did some tests to see how big the difference would be between a C# and a C++ game engine.

The results were quite surprising:

  • C# is slower, but when you let the graphics card and DirectX do as much of the work as possible instead of C# code, the difference between C# and C++ becomes negligible, also thanks to C#'s superior threading control and runtime memory management
  • Developing in C# is a lot faster than in C++, far faster than I initially expected
  • Threading and timing the renders and other operations is a lot faster in C# than it is in C++. Multithreading also doesn't generate as many problems in C# as it does in C++. And maybe even more important: debugging threads in C# is a lot easier than in C++
  • The second critical area (after rendering) is the gameplay engine. Most games need scripts because it is unfeasible to hardcode everything. C# doesn't have to use scripts: loading gameplay logic from separate .NET DLLs works quite well and is hundreds of times faster too (see the sketch after this list)
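
A minimal sketch of that last point, assuming a hypothetical IGameScript interface of my own (this is not an official API): gameplay logic is compiled into an ordinary .NET DLL and loaded with reflection, so no script interpreter is needed.

    using System;
    using System.Collections.Generic;
    using System.Reflection;

    // Hypothetical contract shared by the engine and the gameplay DLL.
    public interface IGameScript
    {
        void Update(float elapsedSeconds);
    }

    public static class ScriptLoader
    {
        // Loads every IGameScript implementation from a separate gameplay DLL.
        public static List<IGameScript> Load(string dllPath)
        {
            List<IGameScript> scripts = new List<IGameScript>();
            Assembly assembly = Assembly.LoadFrom(dllPath);
            foreach (Type type in assembly.GetTypes())
            {
                if (typeof(IGameScript).IsAssignableFrom(type) && type.IsClass && !type.IsAbstract)
                    scripts.Add((IGameScript)Activator.CreateInstance(type));
            }
            return scripts;
        }
    }

The gameplay code runs as compiled .NET code, and swapping such a DLL is still far easier than recompiling the whole engine.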

I wonder what the first major game written in .NET will be. It certainly should attract some attention from game developers.

Server transfer: downtime

I will soon own a quite cheap virtual server (from greenT, a great company) which will give me way more flexibility than the current shared server account on which w-nz.com is located.

I'll transfer my current sites (w-nz.com, xr12.com, intrepidsoft.net) from the current shared server account to the new virtual server.

In the worst case this could result in domain problems, transfer problems and a lack of time to get it done, and therefore a lot of downtime.

So when w-nz.com is down, please be patient, and if you want to email me, use bas.westerbaan@gmail.com instead of my @w-nz.com email addresses.

Negative .Net myths busted

There are a lot of negative myths about .NET which people tend to use to favour traditional languages like C++ over .NET. I've busted the ones I read most frequently:

  • The GC is really slow
    malloc is way slower! The garbage collector of .NET is actually faster than unmanaged allocation, because it knows whether a value is a reference (pointer) and can therefore move objects around in memory. The GC puts objects of about the same age (generation) close to each other in memory, and objects tend to refer to and use objects of the same generation. The processor doesn't load a single value straight from memory; it loads a whole block of a few kilobytes into its cache. When all the objects that one object uses end up cached together, the program simply runs a lot faster, because working from the cache is much faster than re-fetching different parts of memory over and over again, which is what happens with unmanaged languages that just put objects wherever there happens to be free space.
  • Interpreting that stupid Intermediate Language is damned slow
    .NET doesn't interpret its IL; it compiles and optimizes the IL at runtime
  • Compiling at runtime is very slow anyway
    (That compiling C++ is slow doesn't mean that .NET is slow.) Compiling at runtime actually saves a lot of time, because it allows great optimizations like removing unreachable code and inlining based on the actual runtime values. One IL source can also be compiled with processor-specific optimizations. Most of the resource-intensive compiling is done at the startup of the application; some of it happens while the program is running too, but that makes things faster rather than slower
  • If I write assembly myself it will be way superior to anything .Net can generate
    .NET can't make every possible optimization, because analysing the code would take longer than the optimization would gain, but it still usually produces very optimized code. The big problem with writing highly optimized assembly yourself is that the most optimized code is very processor specific, very hard to port, and even worse to maintain; adding one little extra feature can force you to rewrite the whole thing (and yes, I have written programs in assembly). Languages which avoid this somewhat, like C++, still require a different build for every specific processor when fully optimizing. It is also nearly impossible to debug fully optimized unmanaged code, whereas .NET will still give you at least the name of the function in which the error occurred, along with the offset (try to accomplish that with C++ in release mode)
  • The runtime is soooo damned big, it sucks
    20 MB isn't a lot. It only has to be downloaded once, and the .NET framework is in Windows Update, so everyone who keeps his computer up to date will have it installed by now. There is usually enough room on your software installation CD to include .NET; it is more than worth those 20 MB. Besides, languages like C++ require their own runtimes, which aren't exactly cooperative. Does 'DLL hell' ring a bell?
  • The .net library naming SUCKS
    Yeah, its naming is different from what MFC uses. At least the naming is very consistent, which is way more important than 'nice naming', although when I see some of the C++ API names in use I still wonder how anyone could prefer those over the clear .NET naming
  • The .net library itself sucks
    Really? Like what? What can’t it do?
  • You can’t use API calls like CreateFile
    Now I can’t…
    [DllImport("kernel32.dll", SetLastError = true)]
    public static extern IntPtr CreateFile(
        string lpFileName,
        uint dwDesiredAccess,
        uint dwShareMode,
        IntPtr lpSecurityAttributes,
        uint dwCreationDisposition,
        uint dwFlagsAndAttributes,
        IntPtr hTemplateFile);
    … now I can! (a minimal usage sketch follows this list)
  • .Net sucks cause it is Microsoft
    Yeah, so what? .NET is an ECMA standard, so you are pretty free to use it, and if there is a catch it hasn't been exploited yet, because on Linux people are happily using Mono to run .NET stuff
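
As promised above, a minimal sketch of actually calling the imported CreateFile (the flag values are the standard Win32 constants; the file path is just an example):

    const uint GENERIC_READ = 0x80000000;
    const uint FILE_SHARE_READ = 0x00000001;
    const uint OPEN_EXISTING = 3;

    // Open an existing file read-only through the raw Win32 API.
    IntPtr handle = CreateFile(@"C:\test.txt", GENERIC_READ, FILE_SHARE_READ,
        IntPtr.Zero, OPEN_EXISTING, 0, IntPtr.Zero);
    if (handle == new IntPtr(-1)) // INVALID_HANDLE_VALUE
        throw new System.ComponentModel.Win32Exception(); // picks up the SetLastError value
    // (a real program would also P/Invoke CloseHandle to release the handle)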

Avalon

http://www.microsoft.com/downloads/details.aspx?familyid=C8F904E1-B4CA-402B-ACCF-AAA2BD60DA74&displaylang=en

With Avalon installed it adds a few new project templates to my Visual C# 2005 Express with which Avalon applications can be made. Avalon is a windowing system which uses XML files to define a form. At the moment I couldn't find a designer, nor any reference to one in the help files, so I assume it isn't implemented yet, which is rather a pain: creating a form by editing the XML by hand is no fun at all, XML is tedious to write and I am just too dependent on the user-friendliness of the form designer.

When compiling, Avalon generates classes and serialized data files in your application to replace the XML files.

So what does it basically do? It lets you design your forms easily (although there isn't a proper designer yet) while maintaining good performance, by replacing the slow XML files with generated classes and serialized resources at compile time.
Having decompiled a few test applications and looked through the SDK, it seems that Avalon can do practically the same as the current Windows.Forms dll.
So now I wonder: in what way would Avalon be better than using the great current form designer and the Windows.Forms dll?

Variable sized floating points

A lot of self-respecting programming languages have an Integer class which can theoretically grow to an unlimited size by using a dynamic array underneath it.
Python has one, and lots of other languages do too.
But still, they haven't got a variable sized floating point, which I find odd, because it shouldn't be a big problem to create one:

  • Wrap one existing variable sized Integer
  • Add an integer which points to the place where the point will be
  • Add operator overrides to get it working
  • Add an integer which specifies how many digits there will be after the point

The latter is quite important to have. We wouldn't want 1 / 3 to cause an infinite loop.
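
A minimal sketch of this first approach, assuming .NET's System.Numerics.BigInteger as the underlying variable sized integer (the type and member names are my own): the limit on digits after the point is exactly what makes a division like 1 / 3 terminate.

    using System.Numerics;

    public struct BigFloat
    {
        public BigInteger Mantissa; // all digits, stored without the point
        public int PointPosition;   // how many of those digits fall after the point

        // Divide with a fixed maximum number of digits after the point,
        // so 1 / 3 becomes 0.333...3 instead of looping forever.
        public static BigFloat Divide(BigInteger a, BigInteger b, int maxDigits)
        {
            BigInteger scaled = a * BigInteger.Pow(10, maxDigits);
            return new BigFloat { Mantissa = scaled / b, PointPosition = maxDigits };
        }
    }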

You could even do it a second way:

  • Wrap two existing variable sized Integers.
  • Add overridden operators to get it working

These 2 integers would represent a fraction: a / b.

By storing the fraction instead of the result you are actually very precise, although some numbers cannot be represented this way and can only be approximated.
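
A minimal sketch of this second approach under the same assumption (the Fraction type is mine): the two BigIntegers form a / b, and the overridden operators work on the fractions directly, so no precision is ever lost.

    using System.Numerics;

    public struct Fraction
    {
        public BigInteger Numerator;
        public BigInteger Denominator;

        public Fraction(BigInteger a, BigInteger b)
        {
            Numerator = a;
            Denominator = b;
        }

        // a/b + c/d = (a*d + c*b) / (b*d): exact, no rounding anywhere.
        public static Fraction operator +(Fraction x, Fraction y)
        {
            return new Fraction(
                x.Numerator * y.Denominator + y.Numerator * x.Denominator,
                x.Denominator * y.Denominator);
        }

        // a/b * c/d = (a*c) / (b*d)
        public static Fraction operator *(Fraction x, Fraction y)
        {
            return new Fraction(x.Numerator * y.Numerator, x.Denominator * y.Denominator);
        }
    }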

Subtext


I like Subtext; I am just wondering how it would be possible to create a big practical project in it:

  • How would it do async operations? GUIs, networking… If there is no difference between development time and execution time, this could be very hard to visualize
  • How would this ever perform properly? Compiling would do the job, but debugging a big application would get rather slow. And if it became dynamic, meaning that the code can drag-and-drop itself, it would be virtually impossible to get a compiled version running quickly.
  • What about internal state? Let's make a static field in a function, and if the language wisely doesn't support that, the same behaviour can be replicated by keeping a file in which a number is incremented. Now we have a field in that function which gets incremented on every call: an internal argument the engine doesn't know about. A function is supposed to generate the same result for the same arguments, but this one doesn't, and because there is no difference between runtime and development time it would behave unpredictably (a minimal illustration follows this list)
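
A minimal C# illustration of that last point (the names are mine): the function below gets the same argument every time, yet returns a different result on every call because of the hidden static field.

    public static class Impure
    {
        static int callCount; // hidden internal state, invisible to the caller

        // Same argument, different result on every call: not a pure definition.
        public static int AddAndCount(int x)
        {
            callCount++;
            return x + callCount;
        }
    }

    // Impure.AddAndCount(1) returns 2, then 3, then 4, ...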

Subtext is rather a way to create a function which is 'instant'; it is a definition rather than an operation. But computers can't just do things instantly, like reading a whole file or executing a series of operations.

I like the idea, but I just see no real practical use.

C&C Generals Scud Hack

I've found a rather interesting 'hack' for C&C Generals on a forum: the Scud hack.

  1. Build a Scud Launcher
  2. Select a unit which can shoot and press Ctrl+1
  3. Select the Scud Launcher and press Ctrl+2
  4. Press 1, then press Shift+2
  5. Now you've got both the Scud Launcher and the unit selected. Click on an area while holding Ctrl to make the unit and the Scud force-fire there.

EA won't fix the hack, but if you use the Scud hack while playing an online game with stats you'll probably get banned.

Chain emails suck & Asia

I have received 5 chain emails in my mailbox today claiming that forwarding them to a dozen other people and adding my name would help the victims of the tsunami in Asia.
How? How can millions of emails help people who, for the most part, don't have computers (anymore, or never had one) to receive email with?

Usually people forward an email to at least 10 others; the forward count on most emails I received was around 400. So let's assume everyone forwards the email to 10 people, and this continues for 400 generations:

10 ** 400 == 1e400

That is far more emails than there are people in the world; in practice people just get the same mail back from someone else later in the chain.

Let's assume that 500 million people receive a certain chain mail, at 100 KB of bandwidth for the sender and receiver combined.

That makes 500 million times 100 KB, which is 50 TB.

1 GB usually costs a provider, let's say, 5 cents: 5 cents times 50,000 GB (50 TB) makes $2500.
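
The same back-of-the-envelope estimate as a tiny snippet (every number in it is one of the rough assumptions above, nothing measured):

    double receivers   = 500e6; // people receiving one chain mail
    double kbPerMail   = 100;   // KB of bandwidth for sender and receiver combined
    double totalTB     = receivers * kbPerMail / 1e9;   // 1 TB = 1e9 KB, so 50 TB
    double pricePerGB  = 0.05;                          // assumed provider cost per GB
    double costDollars = totalTB * 1000 * pricePerGB;   // 50,000 GB * $0.05 = $2500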

If we just stopped forwarding chain mails and all sent a simple postcard to Asia instead, we would show them we care and we would save the mail providers $2500, which lets them lower prices, which means more money for users, which eventually means more money for the world economy, including Asia!

Never forward a chain mail

I would like to use this moment to say that I am shocked by the tsunami in Asia and that I do care about the millions of people over there. I hope this single post will convince at least one person to stop forwarding chain mails; that would save an average of about $100 over time, my donation to them.

Torrent sharing p2p network

In my previous post I discussed Exeem. Exeem is (or actually will be, since it hasn't been launched yet, only announced) a p2p network for sharing, rating and commenting on torrents.

What is a torrent? A torrent is a small file used by the BitTorrent p2p file distribution system to identify a certain file or set of files you can download. You first need the torrent for a file/folder before you can download it.

The major problem with this is that it is impossible to search for downloads from within a BitTorrent client itself, so over time a lot of sites have been created which contain huge searchable collections of torrents. One of these sites was suprnova.org, which was recently shut down due to legal issues.

As I elaborated in my previous post, Exeem will probably suck. So someone will need to do it right by making an alternative.

What issues would need to be solved to create such a p2p torrent sharing network?

  • No centralised client list. Most p2p networks were shut down because they had a centralised tracker to which clients connected to receive the file list and the users available for a certain file. Instead of a centralised server, every single client should tell other clients who else is in the network and what files are there. If every client ships with a built-in list of IPs, it can update that list by querying those IPs for better ones. By rating each IP by uptime and connection bandwidth, a large and changing group of frequently online users can supply the other IPs and handle search queries for the rest.
  • Searching: how do we handle a search query? At this point our client is connected to a few big, frequently online clients in its neighbourhood; let's call them supernodes for now. When we send one of them a search query it first looks in its cache to see whether it already has the result; if not, it looks through its own torrents to see whether one of those matches the query, and if none do it just forwards the query to another, slightly smaller supernode. The problem with this method is that one query can travel through a huge number of nodes, and when you have good bandwidth you end up doing little more than passing queries along to other nodes. To solve this, the query feedback (when a result is found) should contain the source along with an estimate of how many different search queries the node that had the result can answer. That way a client can form a shortcut when it finds a node which either has a lot of searchable files or an enormous cache and which, of course, is also online often and has decent bandwidth (see the sketch after this list)
  • Rating. Alongside every torrent you download or expose for upload there would be a metadata file containing a description, a rating and comments on the torrent itself. The problem with this system is that descriptions and ratings change, and it is very hard to keep every instance of a torrent on the whole network synchronised. You could send a message through the network to the original node you received the file from with the new comment, or you could search for the torrent again by its unique id and message the nodes found to have it. All of these methods still involve a lot of passing messages along.
  • Privacy
  • Client side 'hacking'. If everyone used the default client, which automatically selects supernodes and lets queries pass through, everything would work fine. The big problem is that people could very well start using rogue clients which just leech from the network. Building in methods to get rid of leechers works as long as most people still use the default client, but if people massively switch to rogue clients the network can no longer block them and will effectively destroy itself, because everyone is leeching. This is the biggest threat to a p2p network like this, which relies heavily on everyone helping everyone else, whether they like it or not, by proxying, caching and passing along queries to maintain privacy and decentralisation.
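
A minimal sketch of the supernode search handling described above (the class, the cache and the forwarding logic are my own simplifications, not anything Exeem actually implements): answer from the cache first, then from local torrents, and only then forward, caching whatever comes back so a repeated query stops here instead of travelling the whole network again.

    using System.Collections.Generic;

    public class SuperNode
    {
        // query -> torrent ids we have answered before
        readonly Dictionary<string, List<string>> cache = new Dictionary<string, List<string>>();
        // query -> torrent ids among our own torrents
        readonly Dictionary<string, List<string>> localTorrents = new Dictionary<string, List<string>>();
        // slightly smaller supernodes we forward unanswered queries to
        readonly List<SuperNode> smallerSuperNodes = new List<SuperNode>();

        public List<string> Search(string query)
        {
            List<string> result;

            // 1. Answer from the cache if we have seen this query before.
            if (cache.TryGetValue(query, out result))
                return result;

            // 2. Answer from our own torrents.
            if (localTorrents.TryGetValue(query, out result))
                return result;

            // 3. Otherwise forward to a smaller supernode; cache whatever comes back
            //    so the next identical query forms a shortcut at this node.
            foreach (SuperNode node in smallerSuperNodes)
            {
                result = node.Search(query);
                if (result.Count > 0)
                {
                    cache[query] = result;
                    return result;
                }
            }
            return new List<string>();
        }
    }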

I'd be rather interested in how Exeem will address these issues. I guess they will just try to rule out client-side hacking by incorporating various encryption tricks into their protocol.

Exeem the hype

Slashdot on exeem

Since suprnova.org went offline due to being illegal, the main source for torrents has disappeared, which has led to a hype around Exeem, the replacement made by the original maintainers of suprnova.

First, what actually was suprnova.org? It was a site which maintained torrents for legal and illegal files. And it did even more: all torrents were thoroughly checked, commented on and rated by an enormous team of editors, making sure that the torrents on suprnova.org were the best you could possibly find.

Because suprnova moderated, commented on, rated and checked every single torrent they offered, they were without any doubt illegal. If they had only offered user-uploaded torrents with a nice disclaimer that the torrents are the property of their respective owners, they would probably have gotten away with it, but they also wouldn't have become as big as they were.

Exeem basically offers the same as suprnova.org, except that it is a p2p application, not a centralised website.

It basically stores torrents, and the comments on them, on a peer-to-peer network similar to Kazaa, which makes every single user just as liable, instead of just the main servers as was the case with suprnova. It is very hard for authorities to punish every single user of a p2p network: there would have to be a trial for every single user, which would never be profitable. Governments have tried to convict the big p2p users, because that actually is profitable. The main problem is that there aren't a lot of really big p2p users, just an incredible number of small users who combined are even worse than a few big ones.

Exeem sounds great, and Exeem is an enormous hype. But I think Exeem will suck:

  • Exeem is p2p. This will most likely cause the quality of ratings, comments and moderation of torrents to drop a lot, making it less attractive to users. If, instead, you have a very locked-down system in which only a few people can add new torrents to the network, you need some kind of centralised authority, which will be very vulnerable to legal pursuit.
  • Exeem will be adware. This will cause a lot of people to drop out; no one wants adware on his computer.

Although I could be mistaken, most hypes like this one tend to turn out really disappointing.

In my opinion the only way to get a neat new system like Exeem that works well is to build a p2p torrent distribution network for legal purposes. BitTorrent grew big because it was used to redistribute Linux distributions. Although it will probably be used for illegal purposes too, it would be a very handy system for legal purposes as well.

More on this later…