Losing the plot?

I may be getting old, but I have spent all day on some code and it has fought back.

This is code I used to be able to write in a few hours; indeed, I wrote almost identical code in a few hours many years ago. I am only re-writing it because it needs to be more maintainable, more modular, and usable in a different environment.

But it has not been as fun as it should be, and I even gave up on one bit and gave it to Cliff to play with. That is so unlike me.

I spent over an hour today chasing the fact that I had used gettimeofday(&tv) by mistake instead of gettimeofday(&tv,NULL), and had left out the include that would have warned of the mismatched prototype. Stupid arse mistake if ever I made one.

The issue was that the 2nd argument, being random stack content, was used as a pointer to which to write the timezone, and that corrupted the stack and crashed the forked process. Every minor change, just adding or removing printfs for debugging, flipped it from not working to working or back again. And even valgrind, my friend in times of need, let me down.

I must be losing it.

Anyway, I'll sleep on it and tackle the problem again tomorrow. Hopefully Cliff will have sorted some simple A-law V.23 demod for me. It is not hard, and seriously, I have done this many times before, but I just could not get it to work.

With any luck, in a few days, I'll be able to blog on how wonderful this little bit of code is, but right now I feel like I am getting past it.

P.S. Thanks for all of the "helpful" comments about use of -Wall. Obviously, after 35 years of coding, I did not know about compiler warnings, d'uh. This is, indeed, something that is normally set for everything we do, and the lack of it in the Makefile in question was, perhaps, the first mistake. I am used to it being set, and it is on most of the Makefiles we have. The lack of such warnings, which I am used to getting from the compiler, made it that bit harder to spot the stupid mistake. I have instigated a bit of a witch hunt for any code being compiled without -Wall now, though.


  1. Yes, you are getting old. Soon you'll lose the ability to type vowels nd yll b typng lk ths.

  2. is the clue in the picture perhaps ? :)

  3. It's not actually necessary to pretend it's 1985 every time you sit down at a compiler.

    Perhaps you could reinvigorate your enthusiasm for programming by looking at how languages and modern tools have evolved to prevent you needing to waste your time on the same tedious gotchas which probably plagued K&R in the 70s?

    I don't know what compiler you're using, but assuming it's something vaguely industrial strength, you can almost certainly dramatically improve your life by merely renaming all your .c files to .cpp and fixing the errors which occur. You won't be writing C++ in a way which would delight any purist (or horrify a lover of 'C'), but the compiler will at least stop you calling functions without prototypes.

  4. Sounds like you're not using -Wstrict-prototypes. You should be. (It's not enabled by -Wall. It should be. :/ )

    1. A lot of things are not enabled by -Wall which should be.

  5. Your compiler should have warned you there was no function prototype visible for gettimeofday(). No-one should ever compile with that warning turned off.

  6. Glad to see from your update that you have instigated a witch hunt for missing compiler warnings. You might also want to consider -Wextra, and look at what warnings are not included in -Wall or -Wextra.

    My programming career has been characterised by what I call my "campaign for real warnings". Depressingly, every time I change job, or even change project within the same company, I find another team using inadequate and inconsistent compiler warning settings. We're professionals, and as an industry we should be beyond this crap.

    I blame a lot of this on gcc. It could expose a lot of errors at a stroke by having a sensible set of warnings enabled by default.

  7. I wouldn't beat yourself up too much, being vaguely human we all have "off-days" and times where "we can't see the wood for the trees".
    If you've learnt (or re-learnt the importance of) something from your trials and tribulations, then I don't see it as time wasted - even if it is "a simple thing" :)

    Reminds me of a funny story (and I'm sure we've all got one similar) from my student days in the '90s, when a room full of intelligent programmers (they were the type to be playing around with creating their own 3D engines in the early '90s, before Windows 95 and DirectX), myself (I wasn't really into PCs at the time, I had an Amiga) and another friend of ours (who wasn't into computers) were sat in a room. We were all chatting and socialising, but they had spent what seemed like hours figuring out why this mouse did not work (before PnP, obviously). They were poking around with IRQs and DMAs, fiddling with drivers etc., and generally getting nowhere.
    All of a sudden this friend pipes up and she says "Are you sure it is plugged in???"

    While a little beer may have been involved, it didn't excuse the fact it hadn't been checked - and she didn't let go of that one for quite some time... :)


