Big Data - Conceptual Modeling to the Rescue

Big data is characterized by volume, variety, velocity, and veracity. We should expect conceptual modeling to provide some answers, since it has historically focused on structuring information: making its volume searchable, harnessing its variety uniformly, mitigating its velocity with automation, and checking its veracity with application constraints. We offer perspectives on how conceptual modeling can "come to the rescue" for many big-data applications: handling volume and velocity with automation, mitigating variety with inter-conceptual-model transformations, and increasing veracity with conceptualized constraint checking.