ATtiny4313 on linux

Some time ago I acquired an ATtiny2313 breakout board and was soon running into its memory limitations.

I ordered two ATtiny4313s on ebay, but couldn’t compile any code for them. As it turned out after a longer odyssey, the patches Atmel applied to ‘their’ version of WinAVR were not widely known. I also failed to build a working avr-gcc from the latest GNU GCC sources using the manual provided on the avr-libc website. It compiled all right, but the resulting cross-compiler was completely unaware of the ATtiny4313.
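
A quick way to check whether a given avr-gcc build knows the chip at all is to compile something trivial for it; the exact wording of the failure differs between versions, but an unsupported MCU shows up immediately:

    echo 'int main(void){return 0;}' > check.c
    avr-gcc -mmcu=attiny4313 -Os -c check.c -o /dev/null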

F*CK !!

There’s also quite a heated discussion on avrfreaks.net as to why Atmel has (again) ignored the community’s wish for a true cross-platform tool-chain that includes device support for everything.

There was a download/build/install script for linux to be found on avrfreaks.net as well, but it didn’t produce a working compiler for me.

F*CK !!

BUT NOW… thanks to ‘Bingo600’ this has changed. There’s a new one available here and it works ;-)

Now I’m happily coding and compiling for my new ATtiny4313 chips.
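
For reference, here’s a minimal sketch of what that looks like. The LED pin, the clock assumption and the programmer below are illustrations, not a description of my actual setup:

    /* blink.c - minimal test program for the ATtiny4313,
       assuming an LED on PB0 and the factory-default 1 MHz clock */
    #define F_CPU 1000000UL

    #include <avr/io.h>
    #include <util/delay.h>

    int main(void)
    {
        DDRB |= _BV(PB0);            /* PB0 as output */
        for (;;) {
            PORTB ^= _BV(PB0);       /* toggle the LED */
            _delay_ms(500);
        }
    }

Built and flashed with something along these lines (‘usbasp’ is just an example programmer, and ‘-p t4313’ needs the avrdude.conf patch described below):

    avr-gcc -mmcu=attiny4313 -Os -o blink.elf blink.c
    avr-objcopy -O ihex -R .eeprom blink.elf blink.hex
    avrdude -c usbasp -p t4313 -U flash:w:blink.hex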

Still you’ll want to apply a patch to avrdude.conf to make it aware of that chip; just add it after the ATtiny2313 block. I assume you’re using version 5.10 or later.
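
The entry is just another ‘part’ block. The sketch below only shows its skeleton; copy the memory and timing sections from the ATtiny2313 block and adjust the sizes (4 KB flash, 256 bytes EEPROM), and double-check the signature against the datasheet:

    part
        id          = "t4313";
        desc        = "ATtiny4313";
        signature   = 0x1e 0x92 0x0d;
        # ... remaining settings copied from the ATtiny2313 entry,
        # with the flash and eeprom sizes doubled ...
    ;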

You’ll also need to make sure you have access to the programmer, quite possibly by adding the appropriate udev rules. For openSUSE, you’ll need something like this for the most common programmers. This file is installed automatically if you have avrdude from the repositories on your machine; if you build it on your own, you need to add the rules by hand.
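
As a rough sketch, such a rules file could look like the following. The vendor/product IDs are those of the USBasp, USBtinyISP and AVRISP mkII; the file name and the group are assumptions, adjust them to your distribution:

    # e.g. /etc/udev/rules.d/60-avrdude.rules
    # USBasp
    SUBSYSTEM=="usb", ATTRS{idVendor}=="16c0", ATTRS{idProduct}=="05dc", MODE="0664", GROUP="dialout"
    # USBtinyISP
    SUBSYSTEM=="usb", ATTRS{idVendor}=="1781", ATTRS{idProduct}=="0c9f", MODE="0664", GROUP="dialout"
    # AVRISP mkII
    SUBSYSTEM=="usb", ATTRS{idVendor}=="03eb", ATTRS{idProduct}=="2104", MODE="0664", GROUP="dialout"

After saving the file, reload the rules with ‘udevadm control --reload-rules’ (as root) and re-plug the programmer.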

Have fun. Use at your own risk ;-)

Edit:

If you’re planning to go this way, you need to patch as well. It seems to use some math functions that weren’t in there before. Interestingly, this only affects avr-g++ and not avr-gcc, but adding it shouldn’t do any damage. If you compile with ‘garbage collect sections’, everything that’s not needed will be thrown out again.
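
If that patch boils down to linking in libm, as I assume it does, the effect at the command line is roughly this (file names are placeholders):

    avr-g++ -mmcu=attiny4313 -Os -ffunction-sections -fdata-sections -c main.cpp -o main.o
    avr-g++ -mmcu=attiny4313 -Wl,--gc-sections -o main.elf main.o -lm

The -ffunction-sections/-fdata-sections plus -Wl,--gc-sections combination is what ‘garbage collect sections’ means: the linker drops every section that isn’t referenced, so the extra library costs nothing if you don’t use it.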

This also affects the ‘Arduino IDE’ and its subroutines on linux.

As pointed out in one of the comments, there is a potential security risk when running the script I’ve linked to. Therefore, please memorize the next line and make it part of your ‘kernel’.

Be sure you have valid, verified and functional backups of all your vital data before running external (and potentially harmful) code on your system. The backup(s) must be on an external medium. Everything else is worthless.

The Tao Of Backup


3 Responses to ATtiny4313 on linux

  1. cjameshuff says:

    I hope you made note of the potential havoc Bingo’s scripts can wreak before running them… minor issues like backing up toward the root of the drive and executing rm -rf * along the way. Given that the official responses to a clear explanation of how the problem can occur and how it could easily be fixed were “it hasn’t happened before (that I know of) so it must not be a big problem” and “I run them in a VM, so I don’t care”, I strongly, strongly suggest never running anything written by Bingo600.

    • robert says:

      You have a point there. At least the download/build phase shouldn’t require any root privileges at all.

      But when running ‘make install’, usually as root for a system-wide installation, you’re in the same situation again. If the makefile triggers fatal commands, they’ll just happen. You may be lucky if you have locked down your system with AppArmor/Tomoyo, but otherwise you’d be f*cked as well.

      So backups are vital…

      I do hope people make regular backups. I do. To an external hard disk for the usual stuff and in addition off-site backups for truly vital data. I haven’t been ‘hit’ yet, but as I was once responsible for all the data of a whole department at my former uni, making backups is in my blood now. It’s shocking how little people used to care for their most vital stuff.

      In this case – getting a working avr-gcc for my ATtiny4313 – I personally took the risk.

      I do hope that at the end of this process we all will have access to an updated version of this piece of software, accessible through the tested repositories of our favourite linux distributions.

      • cjameshuff says:

        I was fortunate enough to have recent backups, but still lost much of a day’s work between what was deleted and the time spent doing the restore. The scripts came with no guarantees and I’m not complaining about inconvenience or data loss, but I find his dismissive response to the issue to be completely unacceptable.

        The problem was entirely avoidable, too…the scripts simply took a fundamentally unsafe approach: untarring archives/creating directories, cd’ing into them (which could fail if the unpacking failed), cd’ing into the containing directory (which would then be the home or root directory in most cases), and executing rm -fr * (with root privileges, if you were following the instructions). They could easily be restructured so the rm is done on a temporary directory at a known location rather than wherever execution has left the working directory, and this would never happen.

        I have not looked at the new scripts to see if they have the same problem, but given their author’s attitude toward issues like the above, I have to strongly discourage anyone from running anything written by Bingo600, regardless of how well backed up their system is. At least use a sacrificial VM set up for the purpose.

Comments are closed.