Big data requires a good understanding of the voluminous information already available and the ability to sift out what you need from it. In the case of medical data (you can imagine how large such datasets are), you may also need statistical knowledge for effective implementation. Big Data is multifaceted.
I think one big issue is transforming such unstructured data into (semi-)structured data. The implementation, storage, presentation and interpretation of such a massive amount of data can be very challenging and require subject-specific expertise.
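As a rough illustration of that transformation step, here is a minimal Python sketch that pulls a few fields out of a free-text note into a semi-structured JSON record. The note format and field names are hypothetical and chosen only to show the idea, not to model a real medical data pipeline.

```python
import json
import re

# Hypothetical unstructured free-text record (format is an assumption).
raw_note = "Patient: Jane Doe; Age: 54; Diagnosis: type 2 diabetes; BP 130/85"

record = {}
name = re.search(r"Patient:\s*([^;]+)", raw_note)
age = re.search(r"Age:\s*(\d+)", raw_note)
diagnosis = re.search(r"Diagnosis:\s*([^;]+)", raw_note)
bp = re.search(r"BP\s*(\d+)/(\d+)", raw_note)

# Build a semi-structured record only from the fields that were found.
if name:
    record["patient"] = name.group(1).strip()
if age:
    record["age"] = int(age.group(1))
if diagnosis:
    record["diagnosis"] = diagnosis.group(1).strip()
if bp:
    record["blood_pressure"] = {"systolic": int(bp.group(1)),
                                "diastolic": int(bp.group(2))}

print(json.dumps(record, indent=2))
```

In practice the extraction rules would be far more involved (NLP, domain vocabularies, validation), but the shape of the problem is the same: free text in, a schema-conforming record out.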
Data is growing very rapidly in today's world, and most IR systems have to merge or integrate data (such as XML, text files, web applications and programming interfaces) retrieved from different sources on the deep and surface web. The most commonly faced challenges in integration are the following.
1. The variety of data sources is an issue, because it is difficult to find common operational methodologies between two different domains.
2. Data inconsistency increases when heterogeneous sources are integrated; unstructured data sources make this worse (see the sketch after this list).
3. Query optimization at each level of data integration is another problem to be considered.
4. Insufficient resources are another problem for interoperability: skilled professionals, funding, implementation effort and the required software are all issues to be considered.
5. Scalability is also an issue, because the data sets being integrated are huge.
There are some other factors that affect this topic as well, such as the lack of support systems and the load incurred during data transformation.
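To make points 1 and 2 a bit more concrete, here is a minimal Python sketch (standard library only) of integrating two hypothetical sources, one XML and one JSON, that describe the same entities with different field names and types. The target schema and the last-write-wins merge rule are assumptions made for illustration, not a recommended integration strategy.

```python
import json
import xml.etree.ElementTree as ET

# Two hypothetical feeds describing the same kind of entity differently.
xml_source = """<products>
  <product><id>1</id><title>Keyboard</title><cost>25.0</cost></product>
  <product><id>2</id><title>Mouse</title><cost>12.5</cost></product>
</products>"""

json_source = ('[{"product_id": 2, "name": "Mouse", "price": "12.50"}, '
               '{"product_id": 3, "name": "Monitor", "price": "180"}]')

def from_xml(text):
    # Field names ("title", "cost") are specific to this assumed XML feed.
    for node in ET.fromstring(text).iter("product"):
        yield {"id": int(node.findtext("id")),
               "name": node.findtext("title"),
               "price": float(node.findtext("cost"))}

def from_json(text):
    # The second source uses its own naming and string-typed prices.
    for item in json.loads(text):
        yield {"id": item["product_id"],
               "name": item["name"],
               "price": float(item["price"])}

# Integrate by id; later records overwrite earlier ones, which is one
# naive way of resolving inconsistencies between overlapping sources.
merged = {}
for rec in list(from_xml(xml_source)) + list(from_json(json_source)):
    merged[rec["id"]] = rec

print(merged)
```

Even in this toy case you can see challenges 1 and 2 at work: each source needs its own mapping onto the common schema, and where the sources disagree (product 2), the integrator has to pick a resolution policy.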