As discussed in a previous posting, I’ve been musing over the development of a modernised version of the classic procedural BASIC language, especially with the Raspberry Pi in mind.

With this in mind I’ve been setting out some goals for the project and working a little on some of the syntactical details needed to bring structures, advanced for-loop constructs and other modern features to a BASIC which remains as backwardly compatible with the old Sinclair QL SuperBASIC as possible.

So, here are the goals:

  1. The language is aimed both at the 13 year old bedroom coder, getting him/her enthused about programming, and at the basic needs of the general scientist. (Surprisingly, the needs of these two disparate groups are very similar.)
  2. It must be platform agnostic and portable. It must also have a non-restrictive, unencumbered licence (not the GPL, so probably Apache) so as to allow it to be implemented on all platforms, including Apple’s iOS.
  3. It must have at least two, probably three, levels of language: beginner, standard and advanced. The beginner level would, like its predecessors in the 8-bit era, be forced to use line numbers, for example.
  4. It must have fully integrated sound and screen control available simply, just as in the old 8-bit micro days. This, with the proper manual, would allow a 13 year old to annoy the family within 15 minutes of starting to play.
  5. The graphical capability must include simple ways to generate publishable scientific graphical output both to the screen and as encapsulated Postscript, PDF and JPEG.
  6. The language must have modern compound variables, such as structures, possibly even pseudo-pointers so as to be able to store references to data or procedures and pass them around.
  7. The language should be as backwardly compatible with Sinclair QL SuperBASIC as possible. It’s a well tested language and it works.
  8. The language should be designed to be extendable but it is not envisaged that this would be in the first version.
  9. The language IS NOT designed to be a general purpose application development language, though later extensions may give this ability.
  10. The language will have proper scoping of variables with variables within procedures being local to the current call, unless otherwise specified. This allows for recursion.
  11. All devices and files are accessed via a URI in an open statement.
  12. Channels (file descriptors) must be a special variable type which can be stored in arrays and passed around.
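To give a flavour of goals 11 and 12, opening and using a channel might look something like the following. None of this syntax is fixed yet; the OPEN function, the PRINT #channel form and the “file:” URI scheme are purely illustrative:

out = OPEN("file:///home/stephen/results.txt")
PRINT #out, "Hello, world"
CLOSE out

Because channels would be ordinary variables, they could equally be stored in an array, with each element opened against a different URI, or passed as a parameter to a procedure.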

As I said earlier, I’ve been thinking about how to do a great deal of this syntactically as well. This is where I’ve got so far:

[Edit: The latest version of the following information can be found on my website. The  information below was correct at 10am 23rd February 2012.]


Variable names MUST start with an alphabetic character and can only contain alphabetic, numeric and underscore characters. A suffix can be appended so as to give the variable a specific type, e.g. string. Without a suffix character the variable defaults to a floating point value.

Suffixes are:

$ string
@ pointer

Compound variables.

Compound variables (structures) are created by first using the “DEFine STRUCTure” command to create a template and then creating variables of that type with the “STRUCTure” command:

DEFine STRUCTure name

STRUCTure name varnam[,varnam]

An array of structures can also be created using the STRUCTure command, e.g.

STRUCTure name varnam(3)

The values can be accessed using a “dot” notation, e.g.

DEFine STRUCTure person
name$
age
DIMension vitals(3)

STRUCTure person myself, friends(3)

myself.name$ = "Stephen"
myself.age = 30
myself.vitals(1) = 36
myself.vitals(2) = 26
myself.vitals(3) = 36

friends(1).name$ = "Julie"
friends(1).age = 21
friends(1).vitals(1) = 36
friends(1).vitals(2) = 26
friends(1).vitals(3) = 36

As with standard arrays, arrays of structures can be multi-dimensional.

Structures can contain any number of standard variables, static array types and other structures. However, only structures defined BEFORE the one being defined can be used. Structure definitions are parsed before execution of the program begins. Structure variable creation takes place during execution.
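As an illustration of nesting, a previously defined structure can appear as a member of a later definition (the member-listing syntax here is provisional):

DEFine STRUCTure point
x
y

DEFine STRUCTure line
STRUCTure point start_pt
STRUCTure point end_pt

STRUCTure line diagonal

diagonal.start_pt.x = 0
diagonal.start_pt.y = 0
diagonal.end_pt.x = 100
diagonal.end_pt.y = 100

Defining “line” before “point” would be an error, as “point” would not yet exist when “line” was parsed.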



FOR assignment (TO expression [STEP expression] | UNTIL expression | WHILE
expression) [NEXT assignment]
NEXT [var]

The assignment flags the variable as the loop index variable. Loop index variables are normal variables.

The assignment and the evaluation of the assignment expression happen only once, when entering the loop. The test expressions get evaluated at the beginning of every trip through the loop. If the test fails at the time of loop entry (the index variable is already beyond the TO limit, the UNTIL expression is non-zero or the WHILE expression is zero) the commands within the loop do not get run.

The STEP operator can only be used if the loop index variable is either a floating point variable or an integer. The expression is evaluated to a floating point value and then added to the loop index variable. If the loop index variable is an integer then the value returned by the expression is stripped of its fractional part (truncated towards zero) before being added to the variable.
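For example, a sketch of the intended behaviour with a fractional STEP:

FOR x = 0 TO 1 STEP 0.25
PRINT x
NEXT x

This would print 0, 0.25, 0.5, 0.75 and 1. Were the loop index an integer variable instead, a STEP expression evaluating to 0.25 would be truncated to 0 before each addition, so care would be needed to avoid an endless loop.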


WHILE expression [NEXT assignment]



Equivalent to a FOR loop using the WHILE variant but without an assignment, e.g.

x = 10
WHILE x > 3 NEXT x += y / 3


is equivalent to

FOR x = 10 WHILE x > 3 NEXT x += y / 3





UNTIL expression

The commands within the loop are run until the expression evaluates to a non-zero value.
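For example, assuming (as with the FOR variants above) that a bare NEXT closes the loop:

x = 1
UNTIL x > 1000
PRINT x
x = x * 2
NEXT

This prints the powers of two from 1 up to 512, the loop ending once x reaches 1024 and the expression becomes non-zero.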

Functions and procedures.

A function is merely a special form of a procedure which MUST return a value. The suffix of a function’s name determines its return type, in the same way as variable names.

DEFine PROCedure name[(parameter[,parameter[…]])]

[RETURN expression]
END PROCedure

DEFine FUNction name[(parameter[,parameter[…]])]

RETURN expression
END FUNction

Parameters are local names which reference the passed values by reference. This means that any modification of the parameters within the procedure will change the value of any variables passed to it.

Variables created within the procedure will be local to the current incarnation, allowing recursion. Variables with global scope are available within procedures but will be superseded by any local variables with the same name.
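Putting this together, a recursive function might look like the following (the IF … THEN syntax is assumed to carry over from SuperBASIC):

DEFine FUNction factorial(n)
IF n <= 1 THEN RETURN 1
RETURN n * factorial(n - 1)
END FUNction

PRINT factorial(5)

Because each call gets its own local n, the recursion unwinds correctly and prints 120.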

Joining the fast lane: Fibre to the Cabinet broadband Internet access is here.

Well, after quite a wait the Cowley BT telephone exchange has finally been enabled for Fibre to the Cabinet (FTTC) broadband. Even using BT’s own estimate, the exchange has been nearly two months late coming on-line.

So, what does having the new service involve?

Well, other than a hefty £80+VAT fee, it merely requires a BT Openreach engineer to visit your house and install a modem and additional face-plate filter onto the house’s master “line” socket and then go to the street cabinet containing your connection to rewire it. You will also need a firewall/router which can talk PPPoE. In other words, one which can use a network cable instead of a phone cable. These are the same as those used with Virgin Media cable-modems.

Although BT (via your ISP) will inform you that the process will take up to an hour, in fact it takes a lot less time than this. It’s about 5 minutes for the engineer to unpack the new modem and fix the faceplate and then a further 10 minutes while he hunts for the correct street cabinet and re-wires your phone line. Assuming that you have your router fully set up beforehand that’s it. He just does a few tests and leaves.

In my case, I had a Billion BiPAC 7800N router which can do both ADSL (phone line) and connect via a network cable so all I needed to do was change a setting and reboot it.

So, this, after some tidying, is my new communications system:

Now that everything’s wall mounted and I’ve put all the wires into a conduit it looks a whole lot neater than before. Also, it’s unlikely to be knocked or cables snagged.

At the top of the picture you can see the Billion router. It’s not much to look at but it is a superb router. I do like the way that it can be mounted vertically on the wall, thus taking less space laterally.

Below the router is the BT modem. Thankfully this is the mark 3 model so is less likely to die horribly.

Finally, connected directly into the wall power socket is the Devolo 200Mb/s power line networking module. This connects to a similar unit in the spare bedroom, where my server sits, and to a multi-port power-line network switch in the living room to which is connected the TV, PS3 and amplifier.

So, what does all this shiny new equipment give me over and above what I had before? Other than the 10 times download speed increase and the four times upload speed jump, it also means that the connection should be far more stable. I’m also only paying about £3 more for this service than I was for the ADSL MAX service I was previously on and I get an extra 30GB of download quota bundled in with it.

Basically, I’m happy with it and that’s all that matters.

[Edited to add historical broadband speed test data]

Surviving the stigma: The under cover geek.

A recent experience has made me aware of how much pressure I feel to hide my geekiness, and of the mental defence mechanisms and automatic self-censoring of my expression I’ve gained over the years so as to seem “normal” around others in social situations.

I have become aware that I have a semi-conscious editor metaphorically sitting on my shoulder observing me and suggesting ways to avoid mentioning anything to do with SciFi stories, quotes from films or technology unless someone else mentions it first. There’s a constant feeling that I should hide this side of me.

I think a great deal of this comes from my experiences at school. I was never in the in-crowd. In fact, I was often shunned, left out of all groups and bullied. This meant that I gathered quite a few coping strategies over the years, one of which is an automatic dulling, if not total suppression, of all emotion if I find myself in a stressful situation. It’s probably a very unhealthy thing to have but it was the only defence I had in many of those years.

During the recent experience I mentioned earlier I found myself suppressing the geek side of myself more and more, fearful that the person I was trying to impress would not accept that side of me. This became highly stressful.

I’d like to be able to release my inner geek and have “geek pride” but the defence systems are now so firmly ingrained that I fear I can never be rid of them, except in the company of known geeks.

So, why does society have this reaction to geeks? Well, I imagine that it’s because they are different. The normal will always look on the slightly eccentric with suspicion. Of course, the most extreme end of geekiness can be rather anti-social, but so can the extreme end of the normal womaniser or the normal alcoholic.

Anyway, it’s sad, but that’s the way society is. I’ll just have to try to deal with it and interface with it on its own terms.


Musings on Internet dating.

Having tried Internet dating for some time and getting absolutely nowhere it was interesting to read about some research done in the USA into how this technology is being used and seen by those using it. The results definitely resonated with my experience and those I’ve talked to who have also tried it.

From my own experience and that of those of the opposite sex I’ve talked to (except for a couple of notable exceptions) it seems that the great majority of both men and women who frequent the sites treat it more or less like an on-line shopping experience, with all the consumer ideas of the perfect product which this entails.

There does seem, however, to be a marked difference between what the two genders seem to be looking for in this retail experience.

As for the women, there seem to be two types on the sites, those looking in the most part for a perfect, boxed, shrink-wrapped “bFriend 2” with money-back guarantee and those with children who seem to be looking for an emotional crutch and child minder. Those who are the most picky seem to be the ones who have been through a divorce. This is understandable in a way as they don’t want to submit themselves to the same hurt and pain as they’ve experienced in the past.

The majority of the men are looking for something very different. They seem to see dating sites as seedy singles bars with needy women ready to do anything for “love” and this, from the shocking stories I’ve heard, mostly seem to involve kinky sex.

So, what about those sites which purport to offer a “scientific” matching system? Well, as the report mentioned above put it, they’re not exactly scientific. I’ve filled out the whole slew of personality profile questionnaires and the “matches” they give (from the very small selection available) are very often laughable.

In my experience, the sites which have put the most emphasis upon complex matching have been the least able to supply any meaningful connections. Indeed, the site which makes the most of its matching ability and advertises so on the TV never once gave me any matches who would respond to a simple “Hello”, let alone meet me.

In addition, the advertising intimates that there are thousands of members of the opposite sex just waiting to get in touch with you when you join. Well, there are thousands of members, possibly thousands which match your criteria globally. However, if you trim it down to those who are actually within a realistic distance to make the logistics work and then filter out the vast majority of the members who are inactive (or merely spam-bots) you get down to a very, very small number, in the low hundreds, more likely in the tens. New members seem to appear in quite small numbers too, probably no more than five a week on a good week, and most of those become disillusioned so quickly that they can be thought of as inactive members.

What’s even more problematic is that all these sites charge an exorbitant amount of money per month for this “service”. I have no gripe with paying for a service but some of these sites charge nearly £20. There is no way that a simple web interface to a database and some simple data extraction logic should cost this much to run. It’s close to being a scam.

So, in these days of isolated, static social circles what is the possible alternative for those who wish to find a partner? Actually, I’m not sure there is one. Internet dating seems to be the only game in town, even if it is close to useless.


Mars attacks

Well, Mars is finally getting into a position where I can image it at a reasonable hour of the day (or night).

So, I dozed in bed for most of this evening before getting outside and attempting to get at least one image of Mars.

Mars is rather more difficult to image than Jupiter due to it being far smaller. Thankfully, for Christmas I got a 5x magnification lens which allows me to make the image on the “webcam” camera large enough to even attempt the task. Even so, with the degree of atmospheric disturbance I never thought I could get anything decent tonight.

Well, surprisingly, after a bit of processing in Registax, this is what I managed to obtain:


Astronomical observatory: News update

I’ve been rather quiet about the observatory and astronomy in general in this blog for quite some time now. (Actually, looking back it was last June.) So, what’s been going on?


When the weather has allowed I’ve been out a number of times, the most fruitful being around the end of October when Jupiter was close to opposition and hence the largest it can be.

During this time the weather was helpfully quite good, with a number of days of exceptionally stable air giving rather good viewing conditions (known as “seeing”), which gave me a good opportunity to try out a time-lapse video of Jupiter turning.

It took a few hours to image Jupiter using a modified web-cam device and then a further half a day to process each of the video clips into single images such as this:

Each of the images was then again put into Registax, aligned and then made into a time-lapse movie:

Which does look rather impressive.

Unfortunately, after that period the weather hasn’t been very kind and Jupiter has drifted further away from us as we journey around the Sun. Still, Mars is on its way!

After the purchase of a new telescope at the new year (more later) I’ve managed to get out only a couple of times, mostly to do tests. The best image so far has been of the Orion Nebula taken by imaging a number of times with my Nikon D90 and then combining the sub-frames into a final image:

It could be better. The telescope mount wasn’t fully calibrated, I need a light pollution filter and the stars were twinkling wildly due to disturbed air. Still, it’s a start and far better than I could achieve with the Meade LX90.


Most of my time in the observatory has been spent trying to find ways to stop water seeping in and causing mould to form. In the end I had to spray the whole of the outside of the base with a rubber sealant paint and drill some drain holes in the roof roller rails. These seem to have mitigated the problem but I’ll know better when there’s some more heavy rain.

Other than that, the biggest news is the purchase of a new telescope with a “German” equatorial mount. The fact that the new optical tube is larger too is a side issue as the difference in price between the sizes wasn’t that great and so the biggest which would fit the observatory and the budget was the one I went for.

The reason for the change was two-fold. Firstly, the Meade LX90’s mount is rather crude and wanders all over the place, making it useless for even medium length imaging of deep-sky objects. Secondly, the fork mount, when combined with an equatorial wedge, excludes much of the sky as the mount gets in the way of the camera.

Anyway, in the end, after quite some extensive research I decided upon the Celestron EdgeHD 1100, 11″ scope. That, along with the hardware to fit it onto the observatory’s pedestal and guide scope, cost a pretty penny.

It was only after I mounted it in the observatory that I found the one big problem with the guide scope (other than the weight): it won’t fit in the observatory. (In fact, the main scope only just fits when it’s at its easterly range of travel.) Still, other than that “Doh!” moment everything’s fine.

The new ‘scope at home.

Well, that brings things up to date really… More to do in the future.

My 30 year personal computing odyssey… So far.

The Journey Begins.

Sinclair ZX81

It was almost precisely 30 years ago today that my journey into the world of computing began. I remember the day that my parents bought the Sinclair ZX81 which was to become my Christmas present, we’d gone to Bedford to buy it in W.H.Smiths and it came in a brown cardboard box with nothing printed on the outside. We’d then all got into the car and whilst we drove up past St. Neots towards some shop on the Cambridge road I was able to open the box and start to read the manual. (We didn’t find the shop in the end and I can’t remember what it was supposed to be selling. Instead we turned back at a small roundabout and drove home.)

At the time I thought of computers as literally magical things. I’d seen them on “Tomorrow’s World” where, a year or so before, they were extolling the new technology which now cost less than a thousand pounds (showing the TI-99/4). Other than this I’d only seen computers on “Horizon” or in science fiction but here, now, was one sitting in a small box on the back seat of my parents’ car beside me. I also marvelled at the ZX81 manual with its painting of a science fiction inspired landscape. (Why are computer manuals so much more boring these days?)

As for programming, at this time I’d only overheard conversations from my class mates at my new school who had had some lessons in the science block. They talked of mystical incantations and something to do with “print, print, print.”

Of course, this being a Christmas present, once we got home it was put away in a top cupboard, out of sight. But still, that was the beginning of the journey.

One Small Computer For A Man…

And so, Christmas Day came and I was at last able to get my hands on the ZX81. It was set up on a chrome steel and glass coffee table and connected to our old “Elizabethan” 12″ black and white portable TV which we’d used in the caravan on holiday. I already had a “Binatone” cassette recorder, which I remember getting for my birthday in ’77, but at this point it wasn’t able to be used as I had no tapes with software on. However, the Christmas of 1981 was spent cross-legged tapping at the flat plastic membrane keyboard typing in the examples from the manual.

It wasn’t long, however, until I soon hit the limit of the 1K memory, so my progress stalled for a while. It wasn’t until my birthday in February that I managed to get the 16K RAM Pack. Wow! How could anyone fill a whole 16K?! Well, I certainly couldn’t.

Anyway, at this point I think I should start compressing the time scales otherwise this post will become a book. Suffice it to say that the ZX81 was my mainstay computer for a further 15 months and it taught me the basics. It also taught me how to be patient after spending one and a half days typing in hex code out of the “Your Computer” magazine only for a thunderstorm to wipe out my work. A further two days of typing later and a rudimentary “Space Invaders” game was ready to play, which worked for about a minute until it crashed due to a typo somewhere in the pages of code.

The Steady March of Progress.

In the May of 1983 I finally persuaded my Dad to help me buy a replacement computer, a ZX Spectrum 16K. At the time this cost a huge amount, £125. Well, at the time £125 WAS a lot of money, at least for my family. Of course, the timing was awful as only a couple of weeks later Sinclair dropped the price of the Spectrum so that £125 would get you the 48K model. Later in the year I sold the ZX81 to one of my Dad’s work mates so I could buy a Fox Electronics 48K upgrade as many of the games I wanted to play by now required the larger memory. (Can you remember when games were all £4.99? Wasn’t it a scandal when they suddenly jumped to about £6 a pop?! :-)) I later bought the ZX81 back from the person I sold it to (at a profit for him) and it’s now in my loft.

The Speccy was the machine upon which I did most of my first real world work. This was helped by the addition of the Interface 1 and ZX Microdrives in the summer of ’84 along with the first printer, a Brother HR-5 thermal ribbon printer which could output at an amazing 30 characters per second. This combination took me right through to half way through my degree, upon which I wrote most of my essays using the “Tasword 2” word processor.

During this period I made my first computer purchase mistake. During the latter months of 1984 I had been reading “Your Computer” magazine and getting more and more enthused about the Memotech MTX series machines. They were sleek (for the time) and they even professed to have a ZX Spectrum emulator in the works. Best of all, they had a built in debugger/assembler/disassembler on board just like the “professional” RML 380Z I’d seen and used at school. How could it be bad?

So, after saving up my student grant (yes, they were magical things too) by basically not having a social life in the first term at Uni (this wasn’t a conscious decision), I spent £199 on an MTX500. This was a very bad move. The machine itself was OK, but being basically an MSX machine without the compatibility, and with software being expensive and hard to come by, it was a bit of a lemon. The Spectrum still got more use.

And On, Into The Future.

Sinclair QL

In the January of 1986 I managed to convince my Mum that I needed something more capable to do my University work upon and so along came the Sinclair QL.

This was a major leap forward. Not only did it come with a full office suite of programs, including a word processor, spreadsheet and database application, but it also had a procedural BASIC programming language and pre-emptive multitasking. i.e. Welcome to the modern world.

Suffice it to say that this machine was invaluable for my University work, not only as a word processor upon which I wrote my degree mapping project report (I won’t go into the story of the power cut in the halls as I was writing the conclusions) but it was also used to write programs to do some of the project work, such as normative mineral analysis and plotting up data for the remote sensing coursework.

It was also the machine which really got me into low level programming and assembler. QDOS is/was a beautiful and simple operating system to code assembler on and Motorola M68000 assembler is really quite high level, the combination of which made it simple to write programs. The high-water mark of which, for me, was a full emulation of the University College London BBC Micro terminal emulator engineered from their documentation. It was a combination of a DEC VT52 emulator and a Tektronix T4010 graphics terminal emulator with access to the BBC’s *FX commands.

The QL also acted as my development machine for many projects during my MSc in Computer Science, especially those involving assembly coding. In a way, this is THE machine I learnt the most from.

Onwards and upwards.

I’m now going to speed up a gear and skim past my first floppy disk drive in ’87, the second hand BBC Micro to play Elite in the December of the same year and even the Atari 520STM in the summer of ’88. No, the next “big thing” was the first hard disk drive in 1989.

It was a revolution! You could store huge amounts! It was fast! It was expensive! Wow!

Actually, other than the first and last statements these would seem laughable today. The device was a 28MB drive for the Atari ST and cost a whopping £400. In today’s money you should probably at least double that figure. Today 28MB would seem like a pitifully tiny amount of storage, barely enough to hold a couple of images taken with a digital camera, but at the time it seemed cavernous. This was helped by the fact that the ST could only use a modified version of the FAT12 file system and the hard disk drivers could only use disk partitions up to 4MB in size!

Oh, and as for the statement, “it was fast”, well all things are relative. There was a disk speed testing program which came with the disk utilities which could measure the speed of your drive. Bear in mind that this drive was a Seagate SCSI device… the maximum read speed was about 600K/s and writes maxed out at about 400K/s! Today I am getting similar speeds from my ADSL connection and I’m not that close to the exchange.

The Technological Slow Down.

Up until now it seemed that every year brought a new wonder. Indeed, with the arrival of first Minix and then MiNT on the Atari ST and TT030, I was getting closer and closer to having a UNIX box in the house. 

My home computing before the PC era.

Before the attack of the IBM PC clones

Actually, in 1993 I picked up a Sun 3/80 via Alec Muffett and then purchased for about £500 a Seagate 425MB hard disk to get it to run and then I DID have a UNIX machine at home. Things were looking up! 

After the PC revolution

My home computing set up in 1995, after the arrival of my second PC. 486DX2-66.

It wasn’t until 1994 that I made my first steps in the “PC” world, picking up the bare bones of a 386SX machine and then sourcing the components to make a working system so that I could try out this new Linux thing and play with Microsoft Windows. Overall I think it cost me another £500 or so to get it running.

Still, it was essentially the end of the “boost phase” of home computing as far as I was concerned. At this point I had effectively, be it in primitive form, everything I have here today. I had a network (10Base2), UNIX and Linux machines, a Windows box and Internet connectivity (albeit via dial-up modem). From then on it was merely a case of a gradual improvement in speed and usability.


Enter the Age of the iDevice.


Yes, I can say that we have now entered a new phase of the computing story. It’s both a very good and a very bad thing.

Effectively, for me this phase was presaged many years before I got my first Apple, when I got my Palm Pilot Pro and mobile phone (a Motorola MR30 brick) in ’97. But it wasn’t really a revolution until I got my first smartphone in 2003, a Handspring (later Palm) Treo 600. It only had GPRS connectivity but it was e-mail on the go! It had limited web browsing. It was amazing at the time. (It also had amazing battery life as well, but that’s another story.)

But it wasn’t until I got the iPhone 3G that I really found how mobile connectivity should be. Simple, sleek, quick and it “just worked”. The iPhone 4 was just as good.

However, the bad thing about all these devices is the way that the iDevice style of simplification is starting to intrude onto desktop (and laptop) machines, locking users out of being able to access and program them. It’s almost as if you’re only buying the privilege to hold and use the devices rather than owning them. This is a potentially slippery slope.

Anyway, I’ve been rambling on for far too long now. So, I’ll conclude this piece and look forward to hopefully another 30 years of the odyssey to come. I think it’s going to be even more evolutionary rather than revolutionary.

[Edit: 7:50pm 12th November, 2011. : Replaced stock image of Sun 3/80 with image of my computer set-up in 1994 and 1995.]

And it ended one year ago…

Yes, it’s now a whole year since I put away the kilts for the last time as normal, everyday wear.

At the time I remember being relieved that the month was over as the tyranny of having to wear the outfit was highly annoying. Well, also the fact that wearing long, woollen socks in the summer is not very comfortable.

I must admit that the year has flown by, probably due to the upheaval at work with the move to the new building, which still has some repercussions. Other than that, I’m not sure a great deal has changed in my life.

As for whether I’ll ever wear any of the kilts as normal wear again, I’m coming to the conclusion that it’s highly unlikely and I should start to think about getting rid of them. Even though they are comfortable to wear (if not to drive in), the perceived social pressure to conform and not wear anything unusual does feel quite strong.

Anyway, it’s probably time to stop thinking about it and leave it in the past…

P.S. Almost as soon as I posted this I got a reaction on Facebook saying, “but…. you looked so good in one!”

But did I really?

Me in a kilt

On the fly VMs: Viable security model for downloaded apps?

I’ve been thinking… always quite dangerous I know…

I woke up early this morning and couldn’t get back to sleep and for some unknown reason I started thinking about downloaded applications and how to prevent trojans getting a hold. Then it came to me, why let the application have real access to the system, especially the filesystem?

I started wondering how feasible it would be to modify the operating system to create on the fly a virtual machine which is a clone of itself within which an untrusted application is run. This VM would not have any real write access to the filesystem but instead would have a copy-on-write shadow copy of the real one. For performance reasons it would have to have pretty transparent access to the graphics sub-system but this shouldn’t be too high a security risk. Once the application had terminated the filesystem write operations could then be vetted and a risk assessment and “reputation” for the application could be determined before actually making the changes to the real data on the disk.

Later on the application could either be manually unrestricted or, if its “reputation” was above a certain threshold, unrestricted automatically.

Anyway, it was just a thought.

[Edit] More thoughts added as a comment.

Google+: Cooking with the curate’s egg?

About a week ago I managed to get hold of an invitation to Google+, the new, not quite publicly available, in development, nascent social site Google are toying with. It’s got quite a “buzz” campaign running about it at the moment and all the Technorati are flocking to use it. But is it any good? Or, more importantly, could it become good enough to win main-stream users from Facebook?

Well, it does have a lot going for it. For a start the interface is clean and the management of the social groups is light years ahead of Facebook’s. There are issues with some of the privacy decisions made in the design, such as limited circulation posts becoming visible to those outside the initial distribution if one of the people within the circle posts a comment with public distribution. However, these are teething problems and the site is still very much under development.

There is currently no API for external applications to be built, such as games. For some people this is a major problem, for others it’s a blessing. It has been stated that a development system is in the works so I don’t see this as a road block in future.

The feel of the site currently has one major down side for a social site: the whole experience seems quite solitary. This isn’t because of a lack of people to “friend” but more that you have no idea if any of your friends are currently on-line. You may not want to interact with them there and then but it’s nice to know that they’re about.

The other problem I see currently is that Google+ seems to be mostly gluing other Google services together. The image uploading and sharing is done using Picasa, which isn’t ideal for the posting of quick images on the go from a smart phone. The messaging service is a poorly integrated link to Google Chat.

One of the most interesting new facilities which could actually make people prefer Google+ over other systems could be the “Hangout” audio/video conferencing and chat sub-system. However, this is crippled by two problems currently. The first one is related to the fact that you don’t know who’s on-line at the moment. i.e. you can’t just invite those you know who are around for a chat, you have to invite blindly. The second one is that you have to download and install a plug-in for your browser for it to work.

So, do I think that it could rival Facebook in the end? Hmm… at the moment I’m not sure. There are currently too many things which make interacting with your friends less immediate. Also, the reliance on glued-on functionality from other Google services which don’t quite match a social sharing system could well be a long-term problem.

So there you have it, at the moment it’s a curate’s egg: good in parts. I don’t want to damn it so early in its development but I am a little worried that the early reputation may stick. Let’s hope it does come to rival Facebook, as Facebook needs competition, especially as its developers seem to be getting into the Firefox and Gnome developers’ mind-set, changing things for change’s sake and seeing themselves as the only arbiters of good design.