I am researching which technologies and platforms are suitable for the big challenges that come with large files (5-50 GB).

Sensors are generating huge measurement files of 5-50 GB per minute, as described in these articles:
https://devblogs.nvidia.com/training-self-driving-vehicles-challenge-scale/
http://stefanradtke.blogspot.com/2017/02/how-isilon-addresses-multi-petabyte.html

Most use cases and Hadoop-based solutions seem to deal with collections of many small files instead.

The challenges are many, and I am not sure how to wrap my head around them. How do we write this data to disk? How do we process such large files? How do we move the data from the sensors to the cloud or to storage? How do we make the data accessible from multiple geographical locations?
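
To make the question more concrete, here is the kind of approach I imagine for the processing and upload parts: stream the file in fixed-size chunks rather than loading it whole, and use multipart upload to object storage. This is a minimal Python sketch of my own, not something from the articles above; the 64 MiB chunk size, the boto3/S3-compatible storage, and the file, bucket, and key names are all placeholder assumptions.

```python
import hashlib

import boto3
from boto3.s3.transfer import TransferConfig

CHUNK_SIZE = 64 * 1024 * 1024  # 64 MiB; placeholder, would need tuning per workload


def process_large_file(path: str) -> str:
    """Stream a multi-GB file in fixed-size chunks so it never has to
    fit in memory; hashing stands in for the real per-chunk processing."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK_SIZE):
            digest.update(chunk)  # replace with actual per-chunk computation
    return digest.hexdigest()


def upload_large_file(path: str, bucket: str, key: str) -> None:
    """Multipart upload to S3-compatible storage; boto3 splits the file
    into parts and uploads them concurrently."""
    config = TransferConfig(
        multipart_threshold=CHUNK_SIZE,  # switch to multipart above 64 MiB
        multipart_chunksize=CHUNK_SIZE,  # size of each uploaded part
        max_concurrency=8,               # parallel part uploads
    )
    boto3.client("s3").upload_file(path, bucket, key, Config=config)


if __name__ == "__main__":
    # File path, bucket, and key below are hypothetical examples.
    print(process_large_file("measurement.bin"))
    upload_large_file("measurement.bin", "sensor-raw", "2024/measurement.bin")
```

Whether a pattern like this can keep up with 5-50 GB per minute, or whether something like HDFS or Isilon-style scale-out storage is needed instead, is exactly what I am trying to figure out.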

Are there any technologies or companies that are suitable for, or already addressing, these problems?


