1. Data compression and parallel computation model research under big data environment
- Author
- Yueqiu Sun, Xian Gong, and Yihe Yang
- Subjects
- Data compression, Lossless compression, Lossy compression, Data loss, Data modeling, Transmission delay, Big data, Parallel computing, Coding and information theory, Computer science
- Abstract
- In a big data environment, data loss is a crucial issue that can occur due to high network traffic, transmission delay, and limited bandwidth. This problem can be mitigated by adopting data compression schemes, which fall into two types according to their behaviour: lossless compression and lossy compression. Lossy compression alters the data, so the reconstructed output is not identical to the input. Lossless compression preserves the data exactly, so the reconstructed output matches the input, but its lower compression ratio can increase network overhead. Existing fixed-length and variable-length coding techniques offer high robustness but poor efficiency. This efficiency problem is addressed by the proposed scheme, called the "data compression and parallel computation research model", which uses a more sophisticated coding technique for data compression, increasing efficiency while reducing delay. Simulation results show that the proposed model achieves a better signal-to-noise ratio, higher efficiency, and lower delay than the existing models. (An illustrative coding sketch follows this record.)
- Published
- 2017
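
The abstract contrasts fixed-length coding with variable-length lossless coding but does not describe the paper's actual technique, so the following is only a minimal sketch, assuming a classic Huffman-style variable-length coder in Python. The function names `huffman_code` and `encode` are illustrative assumptions, not the authors' implementation.

```python
import heapq
from collections import Counter

def huffman_code(data: bytes) -> dict:
    """Build a prefix-free variable-length code table {symbol: bit string}."""
    freq = Counter(data)
    # Heap entries: (frequency, tie-breaker, partial code table for that subtree).
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate input with a single distinct symbol
        return {sym: "0" for sym in heap[0][2]}
    tie = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        # Each merge prepends one bit: 0 for the left subtree, 1 for the right.
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

def encode(data: bytes, table: dict) -> str:
    """Lossless encoding: decoding the prefix-free bit string recovers the input exactly."""
    return "".join(table[b] for b in data)

if __name__ == "__main__":
    sample = b"big data compression in a big data environment"
    table = huffman_code(sample)
    bits = encode(sample, table)
    # Fixed-length coding needs 8 bits per symbol; the variable-length code
    # does better whenever the symbol distribution is skewed.
    print(f"fixed-length: {len(sample) * 8} bits, variable-length: {len(bits)} bits")
```

Because the code is prefix-free, decoding recovers the input exactly (the lossless case described in the abstract); the compression gain over fixed-length coding depends entirely on how skewed the symbol distribution is.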