While I can understand the meaning of encoding and decoding from Wikipedia, I cannot understand why each programming language needs them. And if the answer is related to reading data from an external source such as a database, why do so many of them employ different encoding schemes?
E.g. Python (2.x) defaults to ASCII,
Java relies on the default charset of the underlying OS,
the DB2 database uses IBM-1252.
A general understanding and awareness of the encoding/decoding process that occurs in all communication helps senders and receivers of messages pay closer attention to the intended meaning of those messages and avoid misunderstandings.
The purpose of encoding is to transform data so that it can be properly (and safely) consumed by a different type of system, e.g. binary data being sent over email, or special characters being displayed on a web page. The goal is not to keep information secret, but rather to ensure that it can be consumed correctly.
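As a rough sketch of both of those cases in Python (the sample bytes and strings here are just illustrative): Base64 makes arbitrary binary data safe for text-only channels such as email bodies, and HTML escaping makes special characters safe to show on a web page instead of being interpreted as markup.

```python
import base64
import html

# Binary data (e.g. part of an image) can't travel safely through
# text-only channels such as email, so it is Base64-encoded first.
payload = bytes([0x89, 0x50, 0x4E, 0x47])   # arbitrary binary bytes
encoded = base64.b64encode(payload)         # b'iVBORw=='
assert base64.b64decode(encoded) == payload # decoding restores the original

# Special characters must be escaped before being placed in HTML,
# otherwise the browser would treat them as markup.
print(html.escape("5 > 3 && <b>bold</b>"))  # 5 &gt; 3 &amp;&amp; &lt;b&gt;bold&lt;/b&gt;
```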
In communication terms, encoding is the creation of a message that you want to convey to another person. Decoding, on the other hand, is what the listener or audience of the encoded message does: interpreting its meaning. For example, a breakfast cereal company wants to convey the message that you should buy its product.
Most people like to work with text.
However, computer storage can only work with bytes.
Encoding is the process of converting text to bytes.
Over the past few decades, many different encoding schemes have been developed for different purposes, such as brevity, compatibility, or internationalization.
Today, everything should simply use UTF-8 (sadly, not everything does yet).
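A small sketch of that text-to-bytes round trip in Python 3, where str is text and bytes is what actually gets stored (the sample string is arbitrary):

```python
text = "café ☕"

# Encoding: text -> bytes. The same characters produce different byte
# sequences under different encoding schemes.
utf8_bytes = text.encode("utf-8")        # b'caf\xc3\xa9 \xe2\x98\x95'
utf16_bytes = text.encode("utf-16-le")   # same text, different bytes

# Decoding: bytes -> text, which only works with the matching scheme.
assert utf8_bytes.decode("utf-8") == text
```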
Programmers and users interact with plain text in English or another human-readable language, but a computer does not know how to deal with that. A computer can only deal with bytes, so encoding and decoding are necessary.
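This is also why the differing defaults mentioned in the question matter: if the writer and the reader of the bytes assume different schemes, the text is misinterpreted. A minimal Python illustration (the strings are just examples):

```python
data = "café".encode("utf-8")      # bytes produced by one program

# Decoding with the wrong scheme silently yields mojibake...
print(data.decode("latin-1"))      # cafÃ©

# ...and bytes that are invalid in the assumed scheme fail outright.
try:
    "café".encode("latin-1").decode("utf-8")
except UnicodeDecodeError as err:
    print(err)                     # 'utf-8' codec can't decode byte 0xe9 ...
```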
Quoted from http://searchnetworking.techtarget.com/definition/encoding-and-decoding:
In computers, encoding is the process of putting a sequence of characters (letters, numbers, punctuation, and certain symbols) into a specialized format for efficient transmission or storage. Decoding is the opposite process -- the conversion of an encoded format back into the original sequence of characters. Encoding and decoding are used in data communications, networking, and storage. The term is especially applicable to radio (wireless) communications systems.
The terms encoding and decoding are often used in reference to the processes of analog-to-digital conversion and digital-to-analog conversion. In this sense, these terms can apply to any form of data, including text, images, audio, video, multimedia, computer programs, or signals in sensors, telemetry, and control systems. Encoding should not be confused with encryption, a process in which data is deliberately altered so as to conceal its content. Encryption can be done without changing the particular code that the content is in, and encoding can be done without deliberately concealing the content.
The code used by most computers for text files is known as ASCII (American Standard Code for Information Interchange, pronounced ASK-ee). ASCII can depict uppercase and lowercase alphabetic characters, numerals, punctuation marks, and common symbols. Other commonly-used codes include Unicode, BinHex, Uuencode, and MIME. In data communications, Manchester encoding is a special form of encoding in which the binary digits (bits) represent the transitions between high and low logic states. In radio communications, numerous encoding and decoding methods exist, some of which are used only by specialized groups of people (amateur radio operators, for example). The oldest code of all, originally employed in the landline telegraph during the 19th century, is the Morse code.
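To make the ASCII part of that quote concrete, a short Python illustration (the characters are chosen arbitrarily): every ASCII character maps to a single byte value in the range 0-127, and ASCII text encodes to the same bytes under UTF-8.

```python
# Each ASCII character corresponds to one byte value between 0 and 127.
print(ord("A"))   # 65
print(chr(65))    # 'A'

# ASCII is a strict subset of UTF-8, so pure ASCII text gives identical bytes.
assert "Hello".encode("ascii") == "Hello".encode("utf-8")
```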