When I upload a file from a Nexus 6 using the Amazon S3 SDK, it sometimes throws com.amazonaws.AmazonClientException: More data read (4567265) than expected (4561427). But when I upload an image from a Moto G4 Plus with the same code, the upload succeeds every time.
Please help me solve this issue.
Here is my code for reference:
private void uploadingScreenshot(String filePath) {
    File file = new File(filePath);
    if (file.exists()) {
        final String serverPath = S3Util.getMediaPath(Utility.MediaType.SCREENSHOT, false, "");
        ObjectMetadata meta = new ObjectMetadata();
        meta.setContentLength(file.length());
        S3Util.uploadMedia(SharedFolderDetailActivity.this, file, serverPath, meta, new TransferListener() {
            @Override
            public void onStateChanged(int id, TransferState state) {
                switch (state) {
                    case COMPLETED: {
                        String path = S3Constants.BUCKET_URL + serverPath;
                        callTookScreenshotNotifierWS(path);
                        break;
                    }
                }
            }

            @Override
            public void onProgressChanged(int id, long bytesCurrent, long bytesTotal) {
            }

            @Override
            public void onError(int id, Exception ex) {
                if (ex != null) {
                    Log.e(TAG, ex.getMessage());
                }
            }
        });
    }
}
This function uploads the file to the Amazon S3 server:
public class S3Util {
    public static TransferObserver uploadMedia(final Context context, File file, String s3Path,
                                               ObjectMetadata objectMetadata, TransferListener l) {
        TransferObserver observer = getTransferUtility(context)
                .upload(S3Constants.BUCKET_NAME, s3Path, file, objectMetadata);
        observer.setTransferListener(l);
        return observer;
    }
}
Try this answer: AmazonClientException: Data read has a different length than the expected.
I have also faced this problem before; hopefully it helps you.
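A common cause of this exception is the file changing size between the meta.setContentLength(file.length()) call and the actual transfer, for example when a screenshot is still being flushed to disk. One workaround is to wait until the file's length has stopped changing before starting the upload. Below is a minimal plain-Java sketch of that idea; waitForStableLength is a hypothetical helper, not part of the AWS SDK:

```java
import java.io.File;

public class StableFileCheck {

    // Polls the file's length until it stops changing, so the
    // Content-Length you set on ObjectMetadata matches the bytes
    // the SDK actually reads. Returns true once the length is
    // stable across two consecutive polls, false on timeout.
    public static boolean waitForStableLength(File file, long timeoutMs)
            throws InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutMs;
        long lastLength = -1L;
        while (System.currentTimeMillis() < deadline) {
            long current = file.length();
            if (current > 0 && current == lastLength) {
                return true; // unchanged since the previous poll
            }
            lastLength = current;
            Thread.sleep(200); // poll interval; tune as needed
        }
        return false;
    }
}
```

Call this before building the ObjectMetadata and starting the transfer. Alternatively, simply do not set the content length yourself: TransferUtility.upload(bucket, key, file) computes the length from the file when the transfer starts, which avoids the mismatch in most cases.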