Data model and algorithm

A data model is a strategy for composing and storing data. It organises data according to how it will be administered, accessed, and used, much as the Dewey Decimal System organises books in a library.

Big data is data too large to fit in a single machine's main memory. The need to process it with specialised algorithms arises in machine learning, scientific computing, signal processing, Internet search, network traffic monitoring, and other domains. Turning such data into useful information requires advanced analytics and algorithms.


  • Track 1-1 Data Stream Algorithms
  • Track 2-2 Randomized Algorithms for Matrices and Data
  • Track 3-3 Algorithmic Techniques for Big Data Analysis
  • Track 4-4 Models of Computation for Massive Data
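As a concrete illustration of the data-stream setting named in Track 1-1, the sketch below shows reservoir sampling (Algorithm R), a classic one-pass algorithm that maintains a uniform random sample of k items from a stream too large to store in memory. This is a minimal sketch for illustration only; it is not drawn from any particular track's material.

```python
import random

def reservoir_sample(stream, k):
    """Keep a uniform random sample of k items from a stream of unknown length.

    Makes a single pass over the stream and uses O(k) memory, so the
    stream itself never needs to fit in main memory.
    """
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            # Fill the reservoir with the first k items.
            reservoir.append(item)
        else:
            # Replace a reservoir slot with probability k / (i + 1),
            # which keeps every item seen so far equally likely to survive.
            j = random.randint(0, i)
            if j < k:
                reservoir[j] = item
    return reservoir

# Example: sample 10 items from a stream of one million integers.
sample = reservoir_sample(iter(range(1_000_000)), 10)
```

Because the algorithm reads each item once and then discards it, it works on data arriving over a network or read from disk, which is exactly the constraint the stream model imposes.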
