Hi,

currently the bit-banging algorithm in i2c-algo-bit.c uses at least 27 udelays per byte (3 per bit, and another 3 per byte). Can this be optimized for a block transfer? Does one need to check for a timeout on each bit, or is there a persistent error flag that can be checked once after a larger transaction?

The background is an I2C interface on a TV capture card (a Hauppauge WinTV 150) which is used for a firmware upload. The firmware is 14 kB and takes ~4-5 seconds to upload at 100 kHz.
--
Axel.Thimm at ATrpms.net