NotSoBASIC

As discussed in a previous posting, I’ve been musing over the development of a modernised version of the classic procedural BASIC language, especially with the Raspberry Pi in mind.

To that end I’ve been setting out some goals for the project and working a little on the syntactical details needed to bring structures, advanced for-loop constructs and other modern features to a BASIC that remains as backward compatible with the old Sinclair QL SuperBASIC as possible.

So, here are the goals:

  1. The language is aimed both at the 13 year old bedroom coder, getting him/her enthused about programming, and at the basic needs of the general scientist. (Surprisingly, the needs of these two disparate groups are very similar.)
  2. It must be platform agnostic and portable. It must also have a non-restrictive, unencumbered licence (so not the GPL; probably Apache) so as to allow it to be implemented on all platforms, including Apple’s iOS.
  3. It must have at least two, probably three, levels of language: beginner, standard and advanced. The beginner level would, like its predecessors in the 8bit era, be forced to use line numbers, for example.
  4. It must have fully integrated sound and screen control available simply, just as in the old 8bit micro days. This, with the proper manual, would allow a 13 year old to annoy the family within 15 minutes of starting to play.
  5. The graphical capability must include simple ways to generate publishable scientific graphical output, both to the screen and as Encapsulated PostScript, PDF and JPEG.
  6. The language must have modern compound variables, such as structures, possibly even pseudo-pointers so as to be able to store references to data or procedures and pass them around.
  7. The language should be as backwardly compatible with Sinclair QL SuperBASIC as possible. It’s a well tested language and it works.
  8. The language should be designed to be extendable but it is not envisaged that this would be in the first version.
  9. The language IS NOT designed to be a general purpose application development language, though later extensions may give this ability.
  10. The language will have proper scoping of variables with variables within procedures being local to the current call, unless otherwise specified. This allows for recursion.
  11. All devices and files are accessed via a URI in an open statement.
  12. Channels (file descriptors) must be a special variable type which can be stored in arrays and passed around.

As I said earlier, I’ve been thinking about how to do a great deal of this syntactically as well. This is where I’ve got so far:

[Edit: The latest version of the following information can be found on my website. The  information below was correct at 10am 23rd February 2012.]

Variables.

Variable names MUST start with an alphabetic character and can only contain alphabetic, numeric and underscore characters. A suffix can be appended so as to give the variable a specific type, e.g. string. Without a suffix character the variable defaults to a floating point value.

Suffixes are:

$ string
@ pointer
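
As a quick illustrative sketch of how the suffixes might look in use (the pointer semantics are still to be worked out, so the @ line is hypothetical):

counter = 1.5 : REMark no suffix, so a floating point value
name$ = "Stephen" : REMark the $ suffix makes this a string
who@ = name$ : REMark a pointer/reference; exact semantics to be defined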

Compound variables.

Compound variables (structures) can be created using the “DEFine STRUCTure” command to create a template and then creating special variables with the “STRUCTure” command:

DEFine STRUCTure name
varnam
[…]
END STRUCTure

STRUCTure name varnam[,varnam]

An array of structures can also be created using the STRUCTure command, e.g.

STRUCTure name varnam(3)

The values can be accessed using a “dot” notation, e.g.

DEFine STRUCTure person
name$
age
DIMension vitals(3)
END STRUCTure

STRUCTure person myself, friends(3)

myself.name$ = "Stephen"
myself.age = 30
myself.vitals(1) = 36
myself.vitals(2) = 26
myself.vitals(3) = 36

friends(1).name$ = "Julie"
friends(1).age = 21
friends(1).vitals(1) = 36
friends(1).vitals(2) = 26
friends(1).vitals(3) = 36

As with standard arrays, arrays of structures can be multi-dimensional.

Structures can contain any number of standard variables, static array types and other structures. However, only structures defined BEFORE the one being defined can be used. Structure definitions are parsed before execution of the program begins; structure variable creation takes place during execution.
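
For example, nesting one structure inside another might look like the following. This is only a sketch: the exact syntax for embedding a previously defined structure hasn’t been pinned down yet, so I’m assuming here that the STRUCTure keyword is reused inside the definition:

DEFine STRUCTure date
day
month
year
END STRUCTure

DEFine STRUCTure employee
name$
STRUCTure date started
END STRUCTure

STRUCTure employee boss

boss.name$ = "Stephen"
boss.started.year = 2012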

Loops.

FOR/NEXT:

FOR assignment (TO expression [STEP expression] | UNTIL expression | WHILE
expression) [NEXT assignment]
..
[CONTINUE]
..
NEXT [var]

The assignment flags the variable as the loop index variable. Loop index variables are normal variables.

The assignment and the evaluation of the assignment expression happen only once, when entering the loop. The test expression is evaluated at the beginning of every trip through the loop. If the test fails at the time of loop entry (e.g. the WHILE expression evaluates to zero, or the UNTIL expression is already non-zero) the commands within the loop do not get run at all.

The STEP operator can only be used if the loop index variable is either a floating point variable or an integer. The expression is evaluated to a floating point value and then added to the loop index variable. If the loop index variable is an integer then the value returned by the expression is stripped of its fractional part (truncated towards zero) before being added to the variable.
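
To make the variants concrete, here are a couple of illustrative loops in the proposed syntax (my own sketches, not tested code):

FOR i = 1 TO 10 STEP 2
PRINT i
NEXT i

FOR x = 100 UNTIL x < 1 NEXT x = x / 2
PRINT x
NEXT

The first prints the odd numbers from 1 to 9. The second keeps halving x, printing 100, 50, 25 and so on, stopping once x drops below 1.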

WHILE/END WHILE:

WHILE expression [NEXT assignment]

[CONTINUE]

END WHILE

This is equivalent to a FOR loop, without an initial assignment, using the WHILE variant, e.g.

x = 10
WHILE x > 3 NEXT x += y / 3

END WHILE

is equivalent to

FOR x = 10 WHILE x > 3 NEXT x += y / 3

NEXT

DO/UNTIL:

DO

[CONTINUE]

UNTIL expression

The commands within the loop are run until the expression evaluates to a non-zero value.
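
A small sketch, again in the proposed syntax:

x = 1
DO
PRINT x
x = x * 2
UNTIL x > 100

This prints the powers of two from 1 to 64. Being a bottom-tested loop, the body always runs at least once, even if the UNTIL expression is already true on entry.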

Functions and procedures.

A function is merely a special form of a procedure which MUST return a value. The suffix of the function’s name determines the type of the value returned, in the same way as with variable names.

DEFine PROCedure name[(parameter[,parameter[…]])]

[RETURN expression]
END PROCedure

DEFine FUNction name[(parameter[,parameter[…]])]

RETURN expression
END FUNction

Parameters are local names which refer to the passed values by reference. This means that any modification of the parameters within the procedure will change the value of any variables passed to it.

Variables created within the procedure will be local to the current incarnation, allowing recursion. Variables with global scope are available within procedures but will be superseded by any local variables with the same name.
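
Putting scoping and recursion together, here’s a sketch of a recursive function in the proposed syntax (assuming a SuperBASIC-style single-line IF ... THEN is retained):

DEFine FUNction factorial(n)
IF n <= 1 THEN RETURN 1
RETURN n * factorial(n - 1)
END FUNction

PRINT factorial(5) : REMark prints 120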

Joining the fast lane: Fibre to the Cabinet broadband Internet access is here.

Well, after quite a wait the Cowley BT telephone exchange has finally been enabled for Fibre to the Cabinet (FTTC) broadband. Even using BT’s own estimate, the exchange has been nearly two months late coming on-line.

So, what does having the new service involve?

Well, other than a hefty £80+VAT fee, it merely requires a BT Openreach engineer to visit your house and install a modem and an additional face-plate filter onto the house’s master “line” socket, and then go to the street cabinet containing your connection to rewire it. You will also need a firewall/router which can talk PPPoE, in other words one which can use a network cable instead of a phone cable. These are the same as those used with Virgin Media cable-modems.

Although BT (via your ISP) will inform you that the process will take up to an hour, in fact it takes a lot less time than this. It’s about 5 minutes for the engineer to unpack the new modem and fix the faceplate, and then a further 10 minutes while he hunts for the correct street cabinet and re-wires your phone line. Assuming that you have your router fully set up beforehand, that’s it. He just does a few tests and leaves.

In my case, I had a Billion BiPAC 7800N router which can do both ADSL (phone line) and connect via a network cable so all I needed to do was change a setting and reboot it.

So, this, after some tidying, is my new communications system:

Now that everything’s wall mounted and I’ve put all the wires into a conduit it looks a whole lot neater than before. It’s also unlikely to be knocked or have cables snagged.

At the top of the picture you can see the Billion router. It’s not much to look at but it is a superb router. I do like the way that it can be mounted vertically on the wall, thus taking less space laterally.

Below the router is the BT modem. Thankfully this is the mark 3 model so is less likely to die horribly.

Finally, connected directly into the wall power socket is the Devolo 200Mb/s power line networking module. This connects to a similar unit in the spare bedroom, where my server sits, and to a multi-port power-line network switch in the living room to which is connected the TV, PS3 and amplifier.

So, what does all this shiny new equipment give me over and above what I had before? Other than the 10 times download speed increase and the four times upload speed jump, it also means that the connection should be far more stable. I’m also only paying about £3 more for this service than I was for the ADSL MAX service I was previously on and I get an extra 30GB of download quota bundled in with it.

Basically, I’m happy with it and that’s all that matters.

[Edited to add historical broadband speed test data]

My 30 year personal computing odyssey… So far.

The Journey Begins.

Sinclair ZX81

It was almost precisely 30 years ago today that my journey into the world of computing began. I remember the day that my parents bought the Sinclair ZX81 which was to become my Christmas present: we’d gone to Bedford to buy it in W.H.Smiths and it came in a brown cardboard box with nothing printed on the outside. We’d then all got into the car and, whilst we drove up past St. Neots towards some shop on the Cambridge road, I was able to open the box and start to read the manual. (We didn’t find the shop in the end and I can’t remember what it was supposed to be selling. Instead we turned back at a small roundabout and drove home.)

At the time I thought of computers as literally magical things. I’d seen them on “Tomorrow’s World” where, a year or so before, they were extolling the new technology which now cost less than a thousand pounds (showing the TI 99/4). Other than this I’d only seen computers on “Horizon” or in science fiction, but here, now, was one sitting in a small box on the back seat of my parents’ car beside me. I also marvelled at the ZX81 manual with its painting of a science fiction inspired landscape. (Why are computer manuals so much more boring these days?)

As for programming, at this time I’d only overheard conversations from my class mates at my new school who had had some lessons in the science block. They talked of mystical incantations and something to do with “print, print, print.”

Of course, this being a Christmas present, once we got home it was put away in a top cupboard, out of sight. But still, that was the beginning of the journey.

One Small Computer For A Man…

And so, Christmas Day came and I was at last able to get my hands on the ZX81. It was set up on a chrome steel and glass coffee table and connected to our old “Elizabethan” 12″ black and white portable TV which we’d used in the caravan on holiday. I already had a “Binatone” cassette recorder, which I remember getting for my birthday in ’77, but at this point it couldn’t be used as I had no tapes with software on them. However, the Christmas of 1981 was spent cross-legged, tapping at the flat plastic membrane keyboard, typing in the examples from the manual.

It wasn’t long, however, until I hit the limit of the 1K memory, so my progress stalled for a while. It wasn’t until my birthday in February that I managed to get the 16K RAM Pack. Wow! How could anyone fill a whole 16K?! Well, I certainly couldn’t.

Anyway, at this point I think I should start compressing the time scales otherwise this post will become a book. Suffice it to say that the ZX81 was my mainstay computer for a further 15 months and it taught me the basics. It also taught me how to be patient after spending one and a half days typing in hex code out of the “Your Computer” magazine only for a thunderstorm to wipe out my work. A further two days of typing later and a rudimentary “Space Invaders” game was ready to play, which worked for about a minute until it crashed due to a typo somewhere in the pages of code.

The Steady March of Progress.

In the May of 1983 I finally persuaded my Dad to help me buy a replacement computer, a ZX Spectrum 16K. This cost a huge amount, £125. Well, at the time £125 WAS a lot of money, at least for my family. Of course, the timing was awful as only a couple of weeks later Sinclair dropped the price of the Spectrum so that £125 would get you the 48K model. Later in the year I sold the ZX81 to one of my Dad’s work mates so I could buy a Fox Electronics 48K upgrade, as many of the games I wanted to play by now required the larger memory. (Can you remember when games were all £4.99? Wasn’t it a scandal when they suddenly jumped to about £6 a pop?! :-)) I later bought the ZX81 back from the person I sold it to (at a profit to him) and it’s now in my loft.

The Speccy was the machine upon which I did most of my first real world work. This was helped by the addition of the Interface 1 and ZX Microdrives in the summer of ’84, along with my first printer, a Brother HR-5 thermal ribbon printer which could output at an amazing 30 characters per second. This combination took me right through to halfway through my degree, and on it I wrote most of my essays using the “Tasword 2” word processor.

During this period I made my first computer purchase mistake. During the latter months of 1984 I had been reading “Your Computer” magazine and getting more and more enthused about the Memotech MTX series machines. They were sleek (for the time) and they even professed to have a ZX Spectrum emulator in the works. Best of all, they had a built-in debugger/assembler/disassembler, just like the “professional” RML 380Z I’d seen and used at school. How could it be bad?

So, after saving up my student grant (yes, they were magical things too) by basically not having a social life in the first term at Uni. (this wasn’t a conscious decision), I spent £199 on an MTX500. This was a very bad move. The machine itself was OK, but being basically an MSX machine without the compatibility, and with software expensive and hard to come by, it was a bit of a lemon. The Spectrum still got more use.

And On, Into The Future.

Sinclair QL

In the January of 1986 I managed to convince my Mum that I needed something more capable to do my University work upon and so along came the Sinclair QL.

This was a major leap forward. Not only did it come with a full office suite of programs, including a word processor, spreadsheet and database application, but it also had a procedural BASIC programming language and pre-emptive multitasking. i.e. Welcome to the modern world.

Suffice it to say that this machine was invaluable for my University work, not only as a word processor upon which I wrote my degree mapping project report (I won’t go into the story of the power cut in the halls as I was writing the conclusions) but it was also used to write programs to do some of the project work, such as normative mineral analysis and plotting up data for the remote sensing coursework.

It was also the machine which really got me into low level programming and assembler. QDOS is/was a beautiful and simple operating system to code assembler on, and Motorola M68000 assembler is really quite high level; the combination made it simple to write programs. The high-water mark, for me, was a full emulation of the University College London BBC Micro terminal emulator, engineered from their documentation. It was a combination of a DEC VT52 emulator and a Tektronix 4010 graphics terminal emulator with access to the BBC’s *FX commands.

The QL also acted as my development machine for many projects during my MSc in Computer Science, especially those involving assembly coding. In a way, this is THE machine I learnt the most from.

Onwards and upwards.

I’m now going to speed up a gear and skim past my first floppy disk drive in ’87, the second hand BBC Micro to play Elite in the December of the same year and even the Atari 520STM in the summer of ’88. No, the next “big thing” was the first hard disk drive in 1989.

It was a revolution! You could store huge amounts! It was fast! It was expensive! Wow!

Actually, other than the first and last statements, these would seem laughable today. The device was a 28MB drive for the Atari ST and cost a whopping £400. In today’s money you should probably at least double that figure. Today 28MB would seem like a pitifully tiny amount of storage, enough to hold a couple of images taken with a digital camera, but at the time it seemed cavernous. This was helped by the fact that the ST could only use a modified version of the FAT12 file system and the hard disk drivers could only use disk partitions up to 4MB in size!

Oh, and as for the statement “it was fast”, well, all things are relative. There was a disk speed testing program which came with the disk utilities which could measure the speed of your drive. Bear in mind that this drive was a Seagate SCSI device… the maximum read speed was about 600K/s and writes maxed out at about 400K/s! Today I am getting similar speeds from my ADSL connection, and I’m not that close to the exchange.

The Technological Slow Down.

Up until now it seemed that every year brought a new wonder. Indeed, with the arrival of first Minix and then MiNT on the Atari ST and TT030, I was getting closer and closer to having a UNIX box in the house. 

My home computing before the PC era.

Before the attack of the IBM PC clones

Actually, in 1993 I picked up a Sun 3/80 via Alec Muffett, then purchased a Seagate 425MB hard disk for about £500 to get it running, and then I DID have a UNIX machine at home. Things were looking up!

After the PC revolution

My home computing set-up in 1995, after the arrival of my second PC, a 486DX2-66.

It wasn’t until 1994 that I made my first steps in the “PC” world, picking up the bare bones of a 386SX machine and then sourcing the components to make a working system so that I could try out this new Linux thing and play with Microsoft Windows. Overall I think it cost me another £500 or so to get it running.

Still, it was essentially the end of the “boost phase” of home computing as far as I was concerned. At this point I had effectively, albeit in primitive form, everything I have here today. I had a network (10Base2), UNIX and Linux machines, a Windows box and Internet connectivity (via dial-up modem). From then on it was merely a case of a gradual improvement in speed and usability.

Until….

Enter the Age of the iDevice.

iPhone

Yes, I can say that we have now entered a new phase of the computing story. It’s both a very good and a very bad thing.

Effectively, for me this was presaged many years before I got my first Apple, when I got my Palm Pilot Pro and mobile phone (a Motorola MR30 brick) in ’97. But it wasn’t really a revolution until I got my first smartphone in 2003, a Handspring (later Palm) Treo 600. It only had GPRS connectivity but it was e-mail on the go! It had limited web browsing. It was amazing at the time. (It also had amazing battery life, but that’s another story.)

But it wasn’t until I got the iPhone 3G that I really found how mobile connectivity should be. Simple, sleek, quick and it “just worked”. The iPhone 4 was just as good.

However, the bad thing about all these devices is the way that the iDevice style of simplification is starting to intrude onto desktop (and laptop) machines, locking users out of being able to access and program them. It’s almost as if you’re only buying the privilege to hold and use the devices rather than owning them. This is a potentially slippery slope.

Anyway, I’ve been rambling on for far too long now, so I’ll conclude this piece and look forward to hopefully another 30 years of the odyssey to come. I think it’s going to be more evolutionary than revolutionary.

[Edit: 7:50pm 12th November, 2011. : Replaced stock image of Sun 3/80 with image of my computer set-up in 1994 and 1995.]

On the fly VMs: Viable security model for downloaded apps?

I’ve been thinking… always quite dangerous I know…

I woke up early this morning and couldn’t get back to sleep, and for some unknown reason I started thinking about downloaded applications and how to prevent trojans getting a hold. Then it came to me: why let the application have real access to the system, especially the filesystem?

I started wondering how feasible it would be to modify the operating system to create on the fly a virtual machine which is a clone of itself within which an untrusted application is run. This VM would not have any real write access to the filesystem but instead would have a copy-on-write shadow copy of the real one. For performance reasons it would have to have pretty transparent access to the graphics sub-system but this shouldn’t be too high a security risk. Once the application had terminated the filesystem write operations could then be vetted and a risk assessment and “reputation” for the application could be determined before actually making the changes to the real data on the disk.

Later on, the application could either be manually unrestricted or, if its “reputation” was above a certain threshold, unrestricted automatically.

Anyway, it was just a thought.

[Edit] More thoughts added as a comment.

Google+: Cooking with the curate’s egg?

About a week ago I managed to get hold of an invitation to Google+, the new, not quite publicly available, still-in-development, nascent social site Google are toying with. It’s got quite a “buzz” campaign running about it at the moment and all the Technorati are flocking to use it. But is it any good? Or, more importantly, could it become good enough to win mainstream users from Facebook?

Well, it does have a lot going for it. For a start the interface is clean and the management of the social groups is light years ahead of Facebook’s. There are issues with some of the privacy decisions made in the design, such as limited circulation posts becoming visible to those outside the initial distribution if one of the people within the circle posts a comment with public distribution. However, these are teething problems and the site is still very much under development.

There is currently no API for building external applications, such as games. For some people this is a major problem, for others it’s a blessing. It has been stated that a development system is on the way, so I don’t see this as a roadblock in the future.

The feel of the site currently has one major downside for a social site: the whole experience seems quite solitary. This isn’t because of a lack of people to “friend” but more that you have no idea if any of your friends are currently on-line. You may not want to interact with them there and then but it’s nice to know that they’re about.

The other problem I see currently is that Google+ seems to be mostly gluing other Google services together. The image uploading and sharing is done using Picasa, which isn’t ideal for posting quick images on the go from a smart phone. The messaging service is a poorly integrated link to Google Chat.

One of the most interesting new facilities, which could actually make people prefer Google+ over other systems, is the “Hangout” audio/video conferencing and chat sub-system. However, this is currently crippled by two problems. The first is, again, that you don’t know who’s on-line at the moment, i.e. you can’t just invite those you know are around for a chat; you have to invite blindly. The second is that you have to download and install a plug-in for your browser for it to work.

So, do I think that it could rival Facebook in the end? Hmm… at the moment I’m not sure. There are currently too many things which make it less immediate when it comes to interacting with your friends. Also, the current reliance on glued-on functionality from other Google services, which don’t quite match a social sharing system, could well be a long-term problem.

So there you have it: at the moment it’s a curate’s egg, good in parts. I don’t want to damn it so early in its development but I am a little worried that the early reputation may stick. Let’s hope it does come to rival Facebook, as Facebook needs the competition, especially as its developers seem to be getting into the Firefox and Gnome developers’ mind-set, changing things for change’s sake and seeing themselves as the only arbiters of good design.

Enthusing teen minds: Why today’s computers won’t create tomorrow’s programmers.

The recent 30th anniversary of the launch of the Sinclair ZX81 and the subsequent post on his blog by Jim Finnis brought back to me a recurring thought that today’s computer technology is the antithesis of that required to enthuse a teenager to want to discover and play.

The computers of the early 80s were a blank canvas. You plugged them in, switched them on and (hopefully) the input cursor blinked at you. There was no decoration, no clutter and it was something waiting for YOU to do something to it.

Not only this, but with the manual which came with it a 13 year old could print their name on the screen within 5 minutes and, at least with the second generation of machines, make a funny noise within 10. And within half an hour he or she could have his or her name scrolling up the screen in different colours whilst making unmusical noises and annoying their parents… they were hooked!

Now, let’s look at today’s technology…

The desktop or laptop computer takes an age to start up (i.e. more than 5 seconds) and totally insulates the user from what it is.

Smartphones are usually on all the time so don’t have this problem. Similarly tablets.

They’re immediately brimming with functionality, all vying for your attention, but they’re also incredibly locked down. You can do absolutely anything… ANYTHING, as long as it’s what the visionary who steered the programming teams thinks you should want to do. Woe betide you if you want to do anything different. It’ll either ignore you or give you an unhelpful suggestion in a dialog box. You can be creative, but only in the ways you’re told you can be.

So, what about the art of programming?

Well, on tablets and smartphones forget any native fun. Apparently this is too subversive. On the desktop it’s only slightly better (and I’m not singling out any desktop OS here). What are your options?

Well, on MacOS and Linux you can open a shell window, where all sorts of interpreters and compilers are available, and all sorts of graphics libraries to use with them too. You would think that this would be the ideal playing ground. Sorry to burst that bubble. It’s a great playing ground if you’re already a programming expert. It’s like taking a 5 year old into an engineering workshop, sitting him down and then complaining when he doesn’t build a car: after all, he had all the tools available to him to do it, so it must be his fault.

No, these environments are hopeless for teaching and enthusing. The energy barrier is so large that it’s too daunting to even try. Also, how many lines of code in one of these modern development environments would it take to do the equivalent of the following?

10 FOR x=1 TO 100
20 FOR y=0 TO 7
30 INK y : PAPER 7 - y
40 BEEP 1,y
50 PRINT "Noisey coloured text"
60 NEXT y
70 NEXT x

I bet you’ll find that it’s quite a large number of lines of code, using all sorts of weird and wonderful libraries (possibly some non-standard ones to do the sound) and a whole lot of boilerplate to manage the framework, create a window with the correct attributes, define the font, etc. Hopeless!

Oh, and when it comes to drawing lines and circles etc. Oh dear.

Of course a great many people think that a computer with similar functionality to the old BBC Micro or ZX Spectrum would never be able to compete in the mind of a teen when they have all that touch-screen goodness and Angry Birds to play with. I beg to differ. That this is profoundly not the case was most delightfully illustrated in the second episode of the BBC’s “Electric Dreams” series (unfortunately not available to watch on-line), where the family was given a BBC Micro to play with. The teenage son brought his best friend home from school to play with it and they thought it was awesome. They liked that it was a blank sheet which they could make do what they wanted, rather than being told by the device what they should want to do. And, of course, what they wanted it to do was make silly noises and write their names on the screen in different colours. It sparked enthusiasm!

So, what can be done?

First of all we need to ignore the idealists who think everyone should start their programming life learning something worthy and object-orientated. Once the kids are hooked they can learn that later. Besides, that’s not how people’s minds work: you don’t see object-orientated recipe books for a reason. Also, however annoying to the seasoned programmer, line numbers help beginners understand the sequential way that programs work. In other words, the early 80s micro BASICs got it mostly right. BASIC does stand for “Beginner’s All-purpose Symbolic Instruction Code” after all.

Firstly, any system which is going to enthuse HAS to have as its core functionality the “5, 10, 30 minute” teen-grabbing fun element outlined near the beginning of this post. Without it the whole thing’s lost. Any system would also have to allow growth. Just as BBC BASIC allowed the nascent programmer to grow into using procedures, so should any new project, and possibly more besides, such as variable typing, scoping etc. Line numbers could be made optional in an advanced mode.

Secondly, the freedom of the code itself is far less important than the freedom to discover, so any project should not use a viral license such as the GNU Public License (GPL) but instead use something such as the BSD license.

Thirdly, and helped by the above, the core should be written in a platform neutral way with the platform specific interface on top. In this case, probably the best platform to use would be the GNU compilers, specifically their implementation of Objective C, with the Qt libraries to interface with most operating systems (except, notably, Apple’s, especially the iPhone/iPod/iPad).

The biggest fly in the ointment with this whole pipe dream is that I just don’t have the skills to develop such a system. (Another would be getting people such as Apple to allow the system to be made available via their App Store type portals.)

So, anyone interested in starting a project? 😉

The horror! Scientific code and how not to read your arguments…

Over the years I have seen many, many examples of poor programming practice, usually kludges and quick fixes, but today I saw the most horrible code for reading in command-line arguments in a C program ever. I just had to share the horror…

   if ( (argc-1) < 5 ) {
	.
	.
	.
	[ Usage error response code removed]
	.
	.
	.
   }

   /* read in command-line arguments */
           
   numFiles = (argc-1) - 6;
   sscanf( argv[ numFiles+1 ], "%s", insFileName );
   sscanf( argv[ numFiles+2 ], "%s", outFileName );
   sscanf( argv[ numFiles+3 ], "%d", &outType );
   sscanf( argv[ numFiles+4 ], "%hd", &windowStartTimeCodeword0 );
   sscanf( argv[ numFiles+5 ], "%d", &newStartLine );
   sscanf( argv[ numFiles+6 ], "%d", &newEndLine);

Now, where can I start with this? Erm, I’m a bit dumbfounded actually.

Not only does the test for the incorrect number of arguments check against the wrong number, but the code then indexes the arguments relative to the end of the argument list! Of course, this means that if the wrong number of arguments is given then the values are put into the wrong variables. Worse, values could be read from memory the process doesn’t own.

And there’s more… it blindly sscanf()s them into variables, with no bounds checking at all on the %s conversions.

Now, you may have seen that if one argument is left off the command line, the input file becomes the executable itself and the output file is actually the input data file. This is how the code came to my attention: while trying to debug the program for a student, we found that it wasn’t reading the data correctly… and the data file was mysteriously emptied of its hundreds of megabytes of data each time the program was run. Oops!

So, dear readers, have any of you ever seen a worse command line parsing code segment?