 

What is a data serialization system?

According to the Apache Avro project, "Avro is a serialization system". By calling it a data serialization system, does that mean Avro is a product or an API?

Also, I am not quite sure what a data serialization system is. For now, my understanding is that it is a protocol that defines how a data object is passed over the network. Can anyone explain it in an intuitive way, so that it is easier for people with a limited distributed-computing background to understand?

Thanks in advance!

asked Mar 21 '10 by Yang

People also ask

What is the purpose of serialization?

Serialization is the process of converting an object into a stream of bytes to store the object or transmit it to memory, a database, or a file. Its main purpose is to save the state of an object in order to be able to recreate it when needed.

What is serialization explain with example?

Serialization is a mechanism of converting the state of an object into a byte stream. Deserialization is the reverse process where the byte stream is used to recreate the actual Java object in memory. This mechanism is used to persist the object. The byte stream created is platform independent.
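
As a minimal sketch of that round trip using standard Java serialization (the Person class and its field values are invented purely for illustration):

    import java.io.*;

    // Hypothetical class; implementing Serializable marks it as eligible
    // for Java Object Serialization.
    class Person implements Serializable {
        private static final long serialVersionUID = 1L;
        String name;
        int age;
        Person(String name, int age) { this.name = name; this.age = age; }
    }

    public class SerializationDemo {
        public static void main(String[] args) throws Exception {
            Person original = new Person("Alice", 30);

            // Serialize: object state -> byte stream
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
                out.writeObject(original);
            }

            // Deserialize: byte stream -> a new object with the same state
            try (ObjectInputStream in = new ObjectInputStream(
                    new ByteArrayInputStream(bytes.toByteArray()))) {
                Person copy = (Person) in.readObject();
                System.out.println(copy.name + ", " + copy.age); // Alice, 30
            }
        }
    }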

What are examples of data serialization formats?

XML, JSON, BSON, YAML, MessagePack, and protobuf are some commonly used data serialization formats.


1 Answer

When Hadoop was being written, Doug Cutting decided that the standard Java mechanism for serializing objects, Java Object Serialization (Java Serialization), didn't meet his requirements for Hadoop. Namely, those requirements were:

  1. Serialize the data into a compact binary format.
  2. Be fast, both in raw performance and in how quickly it allowed data to be transferred.
  3. Be interoperable, so that other languages could plug into Hadoop more easily.

As he described Java Serialization:

It looked big and hairy and I thought we needed something lean and mean

Instead of using Java Serialization, they wrote their own serialization framework. The main perceived problem with Java Serialization was that it writes the classname of each object being serialized to the stream, with each subsequent instance of that class containing a 5-byte reference back to the first occurrence instead of the full classname.

Besides reducing the effective bandwidth of the stream, this causes problems with random access and with sorting of records in a serialized stream. Hadoop serialization therefore writes neither the classname nor the back references, and instead assumes that the client knows the expected type.
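
As a rough illustration of that per-object overhead (a sketch, not anything from Hadoop itself; the Point class is invented), Java Serialization writes the full class descriptor only for the first instance and a short handle reference for later ones:

    import java.io.*;

    // Hypothetical record type used only for this size comparison.
    class Point implements Serializable {
        private static final long serialVersionUID = 1L;
        int x, y;
        Point(int x, int y) { this.x = x; this.y = y; }
    }

    public class StreamOverheadDemo {
        public static void main(String[] args) throws Exception {
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            ObjectOutputStream out = new ObjectOutputStream(bytes);

            out.writeObject(new Point(1, 2));   // full class descriptor + field data
            out.flush();
            int afterFirst = bytes.size();

            out.writeObject(new Point(3, 4));   // handle reference + field data only
            out.flush();
            int afterSecond = bytes.size();

            System.out.println("first object:  " + afterFirst + " bytes");
            System.out.println("second object: " + (afterSecond - afterFirst) + " bytes");
        }
    }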

Java Serialization also creates a new object for each one that is deserialized. Hadoop Writables, which implement Hadoop serialization, can be reused, which helps the performance of MapReduce, which essentially serializes and deserializes billions of records.
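
To make the reuse point concrete, here is a minimal sketch of a custom Hadoop Writable; the Temperature class and its field are invented for illustration, only the Writable interface itself is Hadoop's:

    import java.io.DataInput;
    import java.io.DataOutput;
    import java.io.IOException;
    import org.apache.hadoop.io.Writable;

    // A Writable knows how to write and read its own fields;
    // no classname or schema is embedded in the stream.
    public class Temperature implements Writable {
        private int celsius;

        public void set(int celsius) { this.celsius = celsius; }
        public int get() { return celsius; }

        @Override
        public void write(DataOutput out) throws IOException {
            out.writeInt(celsius);      // just the raw field, nothing else
        }

        @Override
        public void readFields(DataInput in) throws IOException {
            celsius = in.readInt();     // overwrites this instance's state in place
        }
    }

Because readFields overwrites the existing instance, a mapper or reducer can allocate one Temperature object and call readFields on it for every record in the stream, rather than allocating billions of short-lived objects.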

Avro fits into Hadoop in that it approaches serialization in a different manner. The client and server exchange a schema which describes the data stream. This helps make it fast and compact and, importantly, makes it easier to mix languages together.

So Avro defines a serialization format, a protocol for clients and servers to communicate these serial streams and a way to compactly persist data in files.
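
As a rough sketch of what that schema-driven approach looks like with Avro's Java API (the User schema and the file name here are invented for illustration):

    import java.io.File;
    import org.apache.avro.Schema;
    import org.apache.avro.file.DataFileWriter;
    import org.apache.avro.generic.GenericData;
    import org.apache.avro.generic.GenericDatumWriter;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.avro.io.DatumWriter;

    public class AvroSketch {
        public static void main(String[] args) throws Exception {
            // The schema is plain JSON and travels with (or alongside) the data,
            // so readers in any language can interpret the stream.
            Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"User\",\"fields\":["
              + "{\"name\":\"name\",\"type\":\"string\"},"
              + "{\"name\":\"age\",\"type\":\"int\"}]}");

            GenericRecord user = new GenericData.Record(schema);
            user.put("name", "Alice");
            user.put("age", 30);

            // An Avro data file stores the schema once in its header,
            // then the records in a compact binary encoding.
            DatumWriter<GenericRecord> datumWriter = new GenericDatumWriter<>(schema);
            try (DataFileWriter<GenericRecord> fileWriter = new DataFileWriter<>(datumWriter)) {
                fileWriter.create(schema, new File("users.avro"));
                fileWriter.append(user);
            }
        }
    }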

I hope this helps. I thought a bit of Hadoop history would help explain why Avro is a subproject of Hadoop and what it's meant to help with.

answered Sep 23 '22 by Binary Nerd