Dr. Vaibhav R. Bhedi
Associate Professor, MCA Department, VMV Commerce, JMT Arts and JJP Science College, Nagpur.
vaibhav_bhedi@rediffmail.com
Abstract:
In the context of big data analytics, Real-Time Analytics and Unified Information Management enable a business to leverage information and analysis as events unfold [1]. Real-Time Analytics comprises the proposed Intelligent Query Adviser, interactive dashboards, event processing, and advanced analytics. The Intelligent Query Adviser identifies the content required for an analysis and determines the expected result. It suggests queries based on the analyst's line of thought and the availability of data in the big data store, thereby helping the analyst to know what data is available. Unified Information Management comprises High Volume Data Acquisition, JIT Acquisition, Multi-Structure Data, Low Latency Data Processing, and Analysis Consistency. High Volume Data Acquisition must gather data from many different channels, but it cannot persist and maintain all the data it receives and may ignore or discard data. JIT Acquisition persists and maintains this ignored and discarded data. Both are included in the interaction layer of the big data architecture.
Keywords: BD, JIT, IQA.
Introduction
Among the many BD (Big Data) challenges is the fact that big data is a large, complex domain involving rapidly growing amounts of data. Because of this, outdated applications and relational database management systems are not up to the mark for serving users rapidly [1]. Real-time analytics is the analysis of data as soon as that data becomes available; in other words, users get insights or can draw conclusions immediately (or very rapidly) after the data enters their system [2]. To approach big data and analytics holistically, it is important to consider what that means: we want to view data in terms of its requirements, qualities, and improvement [1]. This includes its degree of structure, volume, method of acquisition, historical significance, quality, value, and relationship to other forms of data. These requirements and qualities determine how the data is managed, processed, used, integrated, and later served to those who need it. Many types of analysis can be performed, by different types of users, systems, or analysts, using many different methods and tools, and through several varieties of channels. Some types of analysis require current information; others work mostly with historical information.
A big data arrangement includes data acquisition, data pre-processing, and data transmission. Data collection refers to the techniques that gather raw data from the sources where it originates; further processing is then performed on it to fulfil a specific task. When the task of data collection is complete, the data is transferred to storage media for processing and analysis. Because there are many different types of data sources, the collected data may contain redundancy and inconsistency, which makes it wasteful to store as-is. Therefore, to achieve valuable data integration, data pre-processing is applied under many circumstances to the data collected from the sources; this reduces storage and allows the overall data to be stored without discarding anything [2]. Data integration is the foundation of big data and analytics: it collects data from different sources without discarding it and provides a platform for a unified view of the available data [3]. In the High Volume Data Acquisition of Unified Information Management, the collection of raw data from multiple channels shall use an efficient transmission mechanism that neither ignores nor discards data; the data is then sent to the big data store to serve different OLAP tools and applications.
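The pre-processing step described above can be sketched in a few lines. The following is an illustrative example only, not the paper's implementation: it removes redundant (duplicate) and inconsistent (incomplete) records before storage, and the field names `id` and `value` are assumptions made for the example.

```python
# Minimal pre-processing sketch (illustrative assumption, not the
# paper's method): deduplicate records and drop inconsistent ones
# before they reach the storage media.

def preprocess(records):
    """Keep the first occurrence of each id and skip records
    that are missing required fields."""
    seen = set()
    cleaned = []
    for record in records:
        rid = record.get("id")
        if rid is None or "value" not in record:
            continue          # inconsistent record: skip rather than store
        if rid in seen:
            continue          # redundant record: already stored once
        seen.add(rid)
        cleaned.append(record)
    return cleaned

raw = [
    {"id": 1, "value": 10},
    {"id": 1, "value": 10},   # duplicate
    {"id": 2},                # missing field
    {"id": 3, "value": 30},
]
print(preprocess(raw))        # only the two clean, distinct records remain
```

In this sketch, redundancy is resolved by keeping the first copy of each record, so the integrated store holds every distinct record exactly once.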
Real-Time Analytics and Unified Information Management in Big Data and Analytics
In the context of big data and analytics, Real-Time Analytics and Unified Information Management enable a business to leverage information and analysis as events unfold. Real-Time Analytics, with the Intelligent Query Adviser, can determine the expected result. It includes the proposed Intelligent Query Adviser, interactive dashboards, event processing, and advanced analytics. The queries suggested by the Intelligent Query Adviser are based on the analyst's line of thought and the availability of data in the big data store. Unified Information Management stores integrated data from different sources and channels so that information and analysis can be leveraged. It includes High Volume Data Acquisition, JIT Acquisition, Multi-Structure Data, Low Latency Data Processing, and Analysis Consistency.

- Intelligent Query Adviser: Analysis is often a journey of finding and discovery, where the results of one query suggest a number of further queries. The proposed Intelligent Query Adviser advises the user based on the query fired against the big data store. It works in an expeditious manner, keeps pace with the user's thought process, and provides advice or suggestions to the analyst or user. Whenever a user analyses data and tries to extract expected data from the big data store, the analyst fires a query using real-time analytics. The Intelligent Query Adviser keeps details of the queries executed by the analyst, forms queries based on the analyst's need and on the data available in the big data store, and then advises or suggests those queries for the user's analysis. The user or analyst thus receives a set of suggested queries over the available big data from which to make suitable decisions. The system must support this journey in an expeditious manner, and system performance must keep pace with the users' thought process [1].
- Interactive Dashboards: Dashboards provide a heads-up display of information and analysis that is most pertinent to the user. Interactive dashboards allow the user to immediately react to information being displayed, providing the ability to drill down and perform root cause analysis of situations at hand [1]. A well-built interactive dashboard provides a variety of ways to dissect data. You should be able to easily explore your data to discover a wide range of insights.
- Event Processing: Real-time processing of events enables immediate responses to existing problems and opportunities. It filters through large quantities of streaming data, triggering predefined responses to known data patterns [1]. Event Processing delivers the core capabilities for building a real-time, event-based Decision Management System—correlating events, managing decision logic, embedding predictive analytics, optimizing results, and monitoring and improving decision-making [5].
- Advanced Analytics: Advanced forms of analytics, including data mining, machine learning, and statistical analysis, enable businesses to better understand past activities and spot trends that can carry forward into the future. Applied in real time, advanced analytics can enhance customer interactions and buying decisions, detect fraud and waste, and enable the business to make adjustments according to current conditions [1]. Advanced analytics refers to a broad range of analytics intended to give businesses greater insight into their data than they could ordinarily obtain [4].
- High Volume Data: High volume data plays an important role in data acquisition, and all related tools must work properly to complete the task of analysis. Many data acquisition tools and protocols are available; several are open-source solutions that implement the data acquisition process, and all have been developed and are currently working in production environments. The system must acquire all data, whatever its volume, variety, or velocity. Ideally it would persist and maintain all data received, but in practice it discards or ignores data, and only some amount of data is saved, for a short period of time [4].
- Just-In-Time Acquisition: No matter how much volume, velocity, and variety of data is to be processed, High Volume Data Acquisition is not able to persist and maintain all the data it receives; it discards or ignores data, and only some of it is saved for a short period of time. This is an obvious drawback, and JIT Acquisition is introduced to overcome it. The architecture uses the JIT method to streamline the data delivery process. Just-in-Time Acquisition is a stratagem that acquires from High Volume Data Acquisition the data which would otherwise be ignored or discarded while only some of it is saved for a short time. It is a methodology aimed primarily at reducing the time spent within the system's data acquisition, as well as the end user's response time. Just-in-time data delivery focuses on efficiency, while lean High Volume Data Acquisition centres on using efficiency to add value for the end user. The JIT Acquisition process adds value by increasing efficiency.
- Multi-Structure Data: Multi-structure data deals with different forms and types of data, which can arise from interactions between machines and people, for example through web applications or social networks. It concerns the organisation and discovery of multi-structured data. In Unified Information Management, data can be searched across different forms by navigating it, and this ability is improved by organising data of different forms into a common schema. With this organisation, the schema can relate structured data and semi-structured or unstructured data; for example, a model number and specification are structured data, while installation videos are unstructured data. Sophisticated business opportunities can then be discovered across data of different forms in new ways.
- Low Latency Data: Low latency data processing handles a very high volume of data with minimal delay (latency). It is intended to support operations that need real-time access in order to make fast changes to data. Data processing can occur at many stages of the architecture; within the big data processing arrangement, low latency data is processed in a fast and efficient way.
- Analysis Consistency: When different people perform the same form of analysis, they must get the same outcome and, indeed, the same output screen. Obvious as this seems, there should not be even a small difference, especially when the people belong to different departments or locations. Analysis consistency requires architectural reliability and governance.
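The Intelligent Query Adviser described above can be illustrated with a small sketch. This is a hypothetical example, not the paper's implementation: the catalog of available data, the table and column names, and the query templates are all assumptions. The point it shows is the one the paper makes: suggestions are generated only for data that actually exists, and an empty suggestion list tells the analyst to look elsewhere.

```python
# Hypothetical Intelligent Query Adviser sketch: given a keyword from the
# analyst's line of thought and a catalog of what the big data store
# holds, suggest queries only over available data. Catalog contents and
# query templates are illustrative assumptions.

CATALOG = {
    "sales": ["region", "month", "product"],
    "customers": ["region", "segment"],
}

def suggest_queries(keyword):
    """Return candidate queries matching the analyst's keyword; an
    empty list signals that no matching data is available."""
    suggestions = []
    for table, columns in CATALOG.items():
        for col in columns:
            if keyword.lower() in table or keyword.lower() in col:
                suggestions.append(
                    f"SELECT {col}, COUNT(*) FROM {table} GROUP BY {col}"
                )
    return suggestions

print(suggest_queries("region"))   # queries over both sales and customers
print(suggest_queries("weather"))  # [] -> analyst searches elsewhere
```

When the list comes back empty, the analyst knows immediately that the big data store holds nothing relevant, which is the load-reducing behaviour the paper attributes to the Adviser.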
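Event processing, as described above, filters streaming data and triggers predefined responses to known patterns. A minimal sketch, under the assumption of a simple event shape (`source`, `value`) and a threshold pattern invented for the example:

```python
# Illustrative event-processing sketch (assumed event shape and
# pattern): scan a stream and trigger a predefined response whenever
# a known pattern -- here, a threshold crossing -- appears.

def process_stream(events, threshold=100):
    """Yield an alert for every event whose value crosses the threshold."""
    for event in events:
        if event["value"] > threshold:
            yield f"ALERT: {event['source']} reported {event['value']}"

stream = [
    {"source": "sensor-a", "value": 40},
    {"source": "sensor-b", "value": 150},
    {"source": "sensor-a", "value": 220},
]
for alert in process_stream(stream):
    print(alert)
```

A generator is used so events are filtered as they arrive rather than after the whole stream is collected, which is the essence of responding in real time.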
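The interplay between High Volume Data Acquisition and JIT Acquisition can also be sketched. The following is an assumption-laden illustration, not the paper's implementation: the high-volume path keeps only a bounded, short-lived buffer and drops the overflow, while JIT Acquisition persists exactly the records that would otherwise be discarded.

```python
# Sketch of the High Volume / JIT interplay (illustrative only): a
# bounded buffer models "some amount of data saved for a short time";
# records pushed out of it are persisted by the JIT store instead of
# being lost.
from collections import deque

class HighVolumeAcquisition:
    def __init__(self, capacity, jit_store):
        self.buffer = deque(maxlen=capacity)  # only recent data survives
        self.jit_store = jit_store

    def ingest(self, record):
        if len(self.buffer) == self.buffer.maxlen:
            # the oldest record is about to be discarded: hand it to JIT
            self.jit_store.append(self.buffer[0])
        self.buffer.append(record)

jit_store = []                       # persistent store for dropped data
hv = HighVolumeAcquisition(capacity=3, jit_store=jit_store)
for i in range(5):
    hv.ingest(i)

print(list(hv.buffer))   # [2, 3, 4] -- recent data, kept briefly
print(jit_store)         # [0, 1]    -- dropped data, persisted by JIT
```

Without the JIT side-channel, records 0 and 1 would simply vanish when the buffer overflowed, which is the drawback the architecture sets out to remove.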
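Finally, the common schema for multi-structure data can be shown with the paper's own example of a model number with a specification (structured) and installation videos (unstructured). The key, field names, and file names below are assumptions made for illustration:

```python
# Illustrative common-schema sketch: structured fields and unstructured
# assets for a product are related through one shared key (the model
# number). All names and values are assumed for the example.

structured = {
    "M-100": {"specification": "1.5 kW motor"},
}
unstructured = {
    "M-100": ["install_part1.mp4", "install_part2.mp4"],
}

def unified_view(model_number):
    """Navigate both forms of data through the common key."""
    return {
        "model": model_number,
        "specification": structured.get(model_number, {}).get("specification"),
        "videos": unstructured.get(model_number, []),
    }

print(unified_view("M-100"))   # spec and videos returned together
```

The shared key is what lets a search navigate from structured data to the related unstructured assets in a single step.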
Results of Real-Time Analytics and Unified Information Management in Big Data and Analytics
The Intelligent Query Adviser provides suggestions to the analyst or user, derived from the fired query and the availability of data. By observing these suggestions, the analyst or user easily learns the details of the big data store, and the suggestions help in making decisions about drill-down analysis. By taking the Adviser's suggestions, the analyst may get results faster, so the process is less time-consuming and system performance keeps pace with the users' thought process [1]. If the analyst is not satisfied with the suggestions, it means that no data is available for the analyst's or user's need; on learning this, the analyst can leave the big data store and search in another data warehouse. Eventually, therefore, the Intelligent Query Adviser reduces the load on the big data store and frees resources for other users' analysis.
Unified Information Management with JIT Acquisition in big data and analytics provides a solution to the problem of data gathering when high volumes, velocities, and varieties of data come together under the one roof of big data. Previously, High Volume Data Acquisition was not able to persist and maintain all the data it received; it discarded or ignored data, and only some of it was saved for a short period of time. This is the main lacuna of data processing in big data and analytics.
Just-in-Time Acquisition is a stratagem that acquires from High Volume Data Acquisition the data which would otherwise be ignored or discarded while only some of it is saved for a short time. It is a methodology aimed primarily at reducing the time spent within the system's data acquisition, as well as the end user's response time. Just-in-time data delivery focuses on efficiency, while lean High Volume Data Acquisition centres on using efficiency to add value for the end user. The JIT Acquisition process adds value by increasing efficiency; it also helps the end user to collect information, perform analysis, and make suitable decisions, and it keeps information current in real time.
Multi-structure data works with High Volume Data Acquisition and JIT Acquisition to deliver data quickly to web application and social network users; because of this architecture, the delay of data processing is avoided. Analysis consistency is included in the architecture to make it reliable and governable. The architecture can discover and search across data of different types and natures to serve the user quickly, together with the proposed JIT Acquisition. The JIT Acquisition process adds value by increasing efficiency and keeps various types of data in the data warehouse for a long time. The proposed JIT Acquisition completes the data gathering process and serves highly efficient, quality data to the user with minimal delay.
Conclusion
The Intelligent Query Adviser offers useful facilities to the user or analyst. The proposed Intelligent Query Adviser suggests queries based on the analyst's line of thought and determines the expected results from it; those results depend on the availability of data in the big data store. The Intelligent Query Adviser thus helps in making decisions about analysis. Eventually, if the analyst is not satisfied with the suggested queries, the Intelligent Query Adviser reduces the load on the big data store and frees resources for other users' analysis.
High Volume Data Acquisition is not able to persist and maintain all the data it receives; it discards or ignores data, and only some of it is saved for a short period of time, which is an obvious drawback. The architecture uses the JIT method to streamline the data delivery process so that data is not ignored or discarded as it is with High Volume Data Acquisition. Just-in-Time Acquisition is a technique that acquires the data which may be ignored or discarded by High Volume Data Acquisition. It is a methodology aimed primarily at reducing the time spent within the system's data acquisition, as well as the end user's response time. With JIT, High Volume Data Acquisition centres on using efficiency to add value for the end user.
References
- Vaibhav R. Bhedi, Shrinivas P. Deshpande and Ujwal A. Lanjewar, "Data Warehouse Architecture for Financial Institutes to Become Robust Integrated Core Financial System using BUID", International Journal of Advanced Research in Computer and Communication Engineering, Vol. 3, Issue 3, March 2014. ISSN (Online): 2278-102, ISSN (Print): 2319-5940.
- Chen, Min, Shiwen Mao, and Yunhao Liu. “Big data: A survey.” Mobile networks and applications 19.2 (2014): 171-209.
- Lenzerini, M., "Data integration: a theoretical perspective", in Proceedings of the Twenty-First ACM SIGMOD-SIGACT-SIGART Symposium on Principles of Database Systems, ACM, 2002, pp. 233-246.
- Oracle Enterprise Transformation Solutions Series, “Big Data & Analytics Reference Architecture”, Online Available: https://www.oracle.com/assets/oracle-wp-big-data-refarch-2019930.pdf
- Oracle Sponsor Decision Management Solution “Real-Time Responses with Big Data:” Online Available: https://www.oracle.com/assets/realtime-responses-big-data-wp-2524527.pdf
- Lyko, Klaus, Marcus Nitzschke, and Axel-Cyrille Ngonga Ngomo, "Big Data Acquisition", in New Horizons for a Data-Driven Economy: A Roadmap for Usage and Exploitation of Big Data in Europe, ISBN 978-3-319-21568-6, doi: 10.1007/978-3-319-21569-3_4, January 2016, pp. 39-61.