I know this is probably possible using streams, but I wasn't sure of the correct syntax.
I would like to pass a string to the Save method and have it gzip the string and upload it to Amazon S3 without it ever being written to disk. The current method inefficiently reads from and writes to disk in between.
The S3 PutObjectRequest class has a constructor that accepts an InputStream as input.
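For reference, the AWS SDK for Java (v1) exposes an overload along these lines (the exact set of overloads may differ by SDK version):

    PutObjectRequest(String bucketName, String key, InputStream input, ObjectMetadata metadata)

My current code: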
import java.io.*;
import java.util.zip.GZIPOutputStream;

import com.amazonaws.auth.PropertiesCredentials;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.services.s3.model.PutObjectRequest;

public class FileStore {

    public static void Save(String data) throws IOException {
        // Write the string to a temporary file
        File file = File.createTempFile("filemaster-", ".htm");
        file.deleteOnExit();

        Writer writer = new OutputStreamWriter(new FileOutputStream(file));
        writer.write(data);
        writer.flush();
        writer.close();

        // Gzip the temporary file into a second temporary file
        String zippedFilename = gzipFile(file.getAbsolutePath());
        File zippedFile = new File(zippedFilename);
        zippedFile.deleteOnExit();

        // Upload the gzipped file to S3
        AmazonS3 s3 = new AmazonS3Client(new PropertiesCredentials(
                new FileInputStream("AwsCredentials.properties")));
        String bucketName = "mybucket";
        String key = "test/" + zippedFile.getName();
        s3.putObject(new PutObjectRequest(bucketName, key, zippedFile));
    }

    public static String gzipFile(String filename) throws IOException {
        // Create the GZIP output stream
        String outFilename = filename + ".gz";
        GZIPOutputStream out = new GZIPOutputStream(new FileOutputStream(outFilename));

        // Open the input file
        FileInputStream in = new FileInputStream(filename);

        // Transfer bytes from the input file to the GZIP output stream
        byte[] buf = new byte[1024];
        int len;
        while ((len = in.read(buf)) > 0) {
            out.write(buf, 0, len);
        }
        in.close();

        // Complete the GZIP file
        out.finish();
        out.close();
        return outFilename;
    }
}
Even though Amazon S3 has most of the features of a full-fledged web server, it does not transparently support gzip. In other words, you have to compress the files yourself and set the Content-Encoding header to gzip.
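For example, with the v1 Java SDK the header is set through ObjectMetadata at upload time (gzippedBytes here is a placeholder for your compressed data):

    ObjectMetadata metadata = new ObjectMetadata();
    metadata.setContentType("text/html");   // type of the uncompressed content
    metadata.setContentEncoding("gzip");    // tell clients the body is gzip-compressed
    metadata.setContentLength(gzippedBytes.length);
    s3.putObject(new PutObjectRequest(bucketName, key,
            new ByteArrayInputStream(gzippedBytes), metadata));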
With S3 Browser you can automatically compress and/or encrypt files before uploading them to Amazon S3, and automatically decompress and/or decrypt them after downloading.
There are several ways to upload a file to Amazon S3.
When you upload large files to Amazon S3, it's a best practice to leverage multipart uploads. If you're using the AWS Command Line Interface (AWS CLI), then all high-level aws s3 commands automatically perform a multipart upload when the object is large. These high-level commands include aws s3 cp and aws s3 sync.
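In the Java SDK, the high-level TransferManager plays a similar role to those CLI commands, switching to multipart uploads automatically for large objects. A minimal sketch with the v1 SDK (the bucket, key, and file names are placeholders, and waitForCompletion also throws InterruptedException):

    TransferManager tm = TransferManagerBuilder.standard()
            .withS3Client(s3)
            .build();
    Upload upload = tm.upload("mybucket", "test/bigfile.gz", new File("/tmp/bigfile.gz"));
    upload.waitForCompletion();  // blocks until all parts have been uploaded
    tm.shutdownNow(false);       // release TransferManager threads, keep the client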
I would use something like the following:
ByteArrayOutputStream byteOut = new ByteArrayOutputStream();
GZIPOutputStream gzipOut = new GZIPOutputStream(byteOut);
// write your stuff to gzipOut, then finish it so the
// gzip trailer is flushed into the byte array
gzipOut.finish();
gzipOut.close();
byte[] bytes = byteOut.toByteArray();
// write the bytes to the Amazon stream
You write the gzipped data into the in-memory byte stream, then take the resulting bytes and write them to your other stream. You can also wrap the output stream going to Amazon (i.e. the output stream from the HTTP connection, or something similar) in the GZIPOutputStream directly and avoid the ByteArrayOutputStream entirely.
Edit: I noticed your last sentence - bleah. Since PutObjectRequest wants an InputStream, you can take the bytes you created, wrap them in a ByteArrayInputStream, and pass that in as the input stream:
ByteArrayInputStream byteInStream = new ByteArrayInputStream(bytes);
The upload should then read from that input stream, if I am understanding what you are describing correctly. Otherwise, you can simply write straight to the output stream.
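Putting it all together, a Save method that never touches the disk might look like this. This is a sketch, not the original poster's code: it reuses the credentials setup from the question, and the key name and UTF-8 charset are assumptions:

    public static void Save(String data) throws IOException {
        // Gzip the string entirely in memory
        ByteArrayOutputStream byteOut = new ByteArrayOutputStream();
        GZIPOutputStream gzipOut = new GZIPOutputStream(byteOut);
        gzipOut.write(data.getBytes("UTF-8"));  // assumes UTF-8 is acceptable
        gzipOut.finish();  // flush the gzip trailer before grabbing the bytes
        gzipOut.close();
        byte[] zippedBytes = byteOut.toByteArray();

        // S3 needs the content length up front when uploading from a stream
        ObjectMetadata metadata = new ObjectMetadata();
        metadata.setContentLength(zippedBytes.length);
        metadata.setContentEncoding("gzip");

        AmazonS3 s3 = new AmazonS3Client(new PropertiesCredentials(
                new FileInputStream("AwsCredentials.properties")));
        s3.putObject(new PutObjectRequest("mybucket", "test/filemaster.htm.gz",
                new ByteArrayInputStream(zippedBytes), metadata));
    }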