I want to load multiple images very fast on a website, what's the best method?

UPDATE: This question is outdated, please disregard

So... my idea is to load a full manga/comic at once, with a progress bar included, as a sort of stream, like:

  • My page loads the basics (HTML+CSS+JS), of course.
  • Once that's done, I start loading the images (the URLs are stored in a JS variable) from my server, one at a time (or some faster way), so I can build a sort of progress bar.
  • ALTERNATIVE: Is there a way to load a compressed file with all the images and uncompress it in the browser?
  • ALTERNATIVE: I was also thinking of saving them as strings and then decoding them; they are mostly .jpg.
  • The images don't have to show right away; I just need a callback when they are done.

XHTML and HTML5 are both acceptable.

What is the fastest way to load a series of images for my website?

EDIT: Following @Oded's comment: the question is really about the best technique for loading images so the user doesn't have to wait every time they turn the 'page', aiming for an experience closer to reading comics in real life.

EDIT2: As some people helped me realize, I'm looking for a preloader on steroids.

EDIT3: No CSS-only techniques will do.

asked Feb 26 '10 by Fabiano Soriani


3 Answers

If you split large images into smaller parts, they'll load faster on modern browsers due to pipelining.
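A minimal sketch of that idea, assuming the page has been cut server-side into four hypothetical strip files and that a container div with id "page" exists; the strips are requested in parallel and stacked so they render as one page:

// Hypothetical slice URLs for one tall page, cut into strips server-side.
var sliceUrls = [
  "/pages/page1_strip0.jpg",
  "/pages/page1_strip1.jpg",
  "/pages/page1_strip2.jpg",
  "/pages/page1_strip3.jpg"
];

var container = document.getElementById("page"); // assumed <div id="page">

sliceUrls.forEach(function (url) {
  var img = new Image();
  img.src = url;               // all requests go out immediately, in parallel
  img.style.display = "block"; // stack the strips so they read as one page
  container.appendChild(img);
});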


answered by Chris Dennett


ALTERNATIVE: Is there a way to load a compressed file with all the images and uncompress it in the browser?

Image formats are already compressed. You would gain nothing by stitching the files together and trying to compress them further.

You can just stick the images together and use background-position to display different parts of them: this is called 'spriting'. But spriting is mostly useful for smaller images, to cut down the number of HTTP requests to the server and shave a little latency; for larger images like manga pages the benefit is much smaller, and possibly outweighed by having to fetch one giant image up front even if the user only reads the first few pages.
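For completeness, a minimal spriting sketch, with the offset set from JavaScript rather than a stylesheet; the sprites.png file, the #icon element, and the 32x32 icon size are assumptions for illustration only:

// Show the third 32x32 icon from a hypothetical sprite sheet.
var icon = document.getElementById("icon");        // assumed placeholder element
icon.style.width = "32px";
icon.style.height = "32px";
icon.style.backgroundImage = "url('sprites.png')";
icon.style.backgroundPosition = "-64px 0";         // shift left past two icons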

ALTERNATIVE: I was also thinking of saving them as strings and then decoding them

What would that achieve? Transferring the data as strings would, in most cases, be considerably slower than raw binary. Then, to get them from JavaScript strings into images, you would have to use data: URLs, which don't work in IE6-IE7 and are limited in how much data they can hold. Again, this is meant primarily for small images.
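For illustration, the data: URL route looks roughly like this, assuming the server has handed you the JPEG bytes already base64-encoded in a hypothetical base64Jpeg variable:

// base64Jpeg would come from an XMLHttpRequest or be embedded in the page.
var img = new Image();
img.src = "data:image/jpeg;base64," + base64Jpeg;
document.body.appendChild(img);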

I think all you really want is a bog-standard image preloader.
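A sketch of such a preloader with a progress hook, assuming the page URLs already sit in a JS array as the question describes; the onProgress and onDone callback names are made up here:

function preloadPages(urls, onProgress, onDone) {
  var loaded = 0;
  var images = [];

  urls.forEach(function (url, i) {
    var img = new Image();
    img.onload = img.onerror = function () {
      loaded += 1;
      onProgress(loaded, urls.length); // drive the progress bar
      if (loaded === urls.length) {
        onDone(images);                // every page is now in the browser cache
      }
    };
    img.src = url;
    images[i] = img;
  });
}

// Usage, with hypothetical callbacks:
// preloadPages(pageUrls, updateProgressBar, showFirstPage);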

answered by bobince


You could preload the images in JavaScript using:

var x = new Image();
x.onload = function () { /* fires once the image has been fetched and cached */ };
x.src = "someurl";

This achieves the same effect as the approach you described as 'saving them as strings': the browser fetches and caches the image, so it can be displayed instantly later.

answered by Heinrich Lee Yu