Few aspects of machine learning have so far been applied in digital forensics; is deep learning going to be the game changer for adaptive forensics?
Deep Learning uses machine learning techniques to solve problems through neural networks that simulate human decision-making. This means that Deep Learning holds the potential to dramatically change the forensic domain in a variety of ways and to provide solutions to forensic investigators. These may range from reducing bias in forensic investigations to challenging what evidence is considered admissible in a court of law or any civil hearing.
Besides, Deep Learning, which is a subset of Artificial Intelligence, has very distinct use cases in the forensic domain, more specifically in Digital Forensics, and even if many people might argue that it is not an unrivalled solution, it can help enhance the fight against cybercrime.
Nickson M Karie Thanks a lot. Now, for the case of adaptive forensics, I tend to think that, more often than not, we have relied heavily on acquisition and analysis using traditional techniques, and I have not seen many situations where we train our processes/models in the direction of ML. It looks like deep learning may solve a myriad of these issues! Do you think cognitive techniques can present much more realistic problem-solving approaches? Let me hear.
Victor R Kebande Deep learning can be that game changer from traditional techniques to models that use Machine Learning. I recommend "Kernel-based deep learning for intelligent data analysis" by Wang and Pei (2017), who highlighted that a Deep Neural Network can unearth visual patterns through robust learning on huge volumes of data. This means that Deep Learning, when used in Digital Forensics, has the ability to unearth relevant evidence from Big Data as and when required by investigators.
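To make the idea concrete, here is a minimal sketch (my own illustration, not taken from Wang and Pei) of a small feed-forward network that ranks artefacts by predicted evidential relevance. The features, labels and data are entirely hypothetical.

```python
# Minimal sketch: a small neural network that scores hypothetical artefact
# feature vectors as "relevant evidence" or "not relevant". Purely illustrative.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Illustrative data: 200 artefacts, each described by 16 numeric features
# (e.g. entropy, file size, timestamp deltas), with a binary relevance label.
X = torch.randn(200, 16)
y = (X[:, 0] + 0.5 * X[:, 1] > 0).long()   # synthetic labels for the sketch

model = nn.Sequential(
    nn.Linear(16, 32), nn.ReLU(),
    nn.Linear(32, 2),
)
optimiser = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(100):
    optimiser.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimiser.step()

# Rank unseen artefacts by predicted relevance so an investigator can triage them.
new_items = torch.randn(5, 16)
scores = torch.softmax(model(new_items), dim=1)[:, 1]
print(scores)
```

In practice the value would come from the feature extraction and labelled case data, not from the network itself; this only shows how such a triage model would be wired up.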
Nickson M Karie Yeah, that sounds interesting; however, what I find rather disappointing is the slow growth, or the inadaptability, of forensics with respect to AI and machine learning. Deep learning has constructs that can easily be adopted in forensic processes. Is standardizing some of the forensic processes paving the way for fast growth? The generalism in standards can also be a factor, because we tend to lack application-specific objectives that can make digital forensics adaptive. So, what do you think of application-specific deep learning architectures that are tailored to specific digital forensic processes? For example, if we take a dull process like reporting from the ISO/IEC standard, would it make sense to make some of these concepts interactive?
Ooh yes Victor R Kebande Standardization is inevitable in this case, especially when it comes to accelerating growth and the adoption of application-specific objectives.
Remember, as of now the ISO/IEC standard we have is an umbrella standard for all the digital forensic processes involved. There is no standard for an individual or specific process, such as the dull process of reporting that you mentioned as an example from the ISO/IEC standard.
This means that there currently exists a lack of standardized procedures designed to help in preparing quality forensic reports for use in court or civil hearings. This, therefore, results in disparities in how forensic reports are prepared and presented to different stakeholders after an investigation process has been conducted.
With respect to Deep Learning, classification algorithms, for example, can draw conclusions from observed values and determine to which category new observations belong during investigations. Allowing Deep Learning algorithms to assist in producing forensic reports can help save time and money.
However, a standard procedure on how to prepare a high-quality forensic report will do more good than just leaving it to individual investigators to decide what to put, and what not to put, in a forensic report.
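As a purely illustrative sketch of the classification idea above (not a standardized procedure), a simple text classifier could route free-text investigation notes to sections of a draft report; the section names and notes below are hypothetical.

```python
# Minimal sketch: assign investigation notes to report sections so a draft
# report can be assembled consistently. Data and labels are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled notes from past cases.
notes = [
    "disk image hashed with SHA-256 before imaging",
    "chain of custody form signed by the seizing officer",
    "browser history shows access to the phishing domain",
    "timeline reconstructed from NTFS $MFT timestamps",
]
sections = ["acquisition", "chain_of_custody", "analysis", "analysis"]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
clf.fit(notes, sections)

# Route a new finding to the section of the draft report where it belongs.
print(clf.predict(["memory dump acquired with a write blocker attached"]))
```

The point is not the specific model but that categorising findings automatically could reduce the disparities in how reports are assembled, while the investigator still writes and validates the content.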
Nickson M Karie This again reminds me of some of the endemic disparities that you had as a concrete research problem some time back. The other concept that touches on some of the issues you have raised is the digital forensic report. There is the aspect of presenting digital forensic reports and that of generating digital forensic reports. If you observe the trade-off keenly, you will discover that reporting is hanging on a cliff, given that most or all of these processes are presented as umbrella standards; could dissecting each sub-process be viable, or would the scientific rigour be tampered with? While I agree that we still have disparities in digital forensics, I believe that, contrary to the belief that the science relies on the process as a whole, it is possible to subdivide the processes, make them adaptive, and then join them holistically. What do you think about the evaluation approaches that have been extrapolated in most of these forensic models; do you think more is needed?
Victor R Kebande In my opinion, standardized procedures and specifications for preparing forensic reports need to be developed. This will, for example, help investigators and law enforcement agencies have a standard way to determine, with less effort, the validity, weight and admissibility of any forensic evidence brought before the court.
However, standardization of any of the specific procedures should be such that the report produced is comprehensive and admissible for presentation in any court of law or legal proceedings.
Nickson M Karie I totally concur with some of the illustrations that you have put across. Hitherto, the acceptability-of-forensics lingo has always had its way between the digital forensic experts and the LEAs; what sits in between is technology and the legal connotation, which blindly crops in as a challenge. The era we have reached faces very many issues when it comes to admissibility, because what has for many years been thought to be admissible seems to have changed with the diversification, proliferation and reintroduction of new technologies. Humans have been subjected to very antagonizing forensic situations, which is why there is currently a big limitation on what should and should not be touched as potential evidence. Not long ago the EU data protection act touched on very sensitive issues; now the scope has shifted to the EU GDPR, which is even more restrictive on privacy issues. This again makes me put forward an argument: is what is admissible really admissible? If you followed closely, there was recently a case in the USA where an error easily acquitted a number of rape suspects, and we have also had wrong convictions in situations where evidence was admitted and the jurors gave a judgement. Consequently, even though image examiners previously relied on similarly flawed methods, they have continued to testify to and defend their exactitude, according to a review of court records and examiners. Do you think standardizing the techniques for achieving admissibility would provide a long-lasting solution with the changing technologies? Let me know.
Victor R Kebande Note that the forensic domain is growing and gaining popularity among many professionals. This has brought about several proposals for investigation process models that can offer direction on how to recognize and preserve potential digital evidence obtained from a crime scene. However, the vast number of existing models and frameworks has added to the complexity of the forensic field. This situation has further created an environment replete with disparities in the domain, which need to be resolved: disparities in evidence acquisition, transportation, preservation, analysis, reporting and many others. Surely a solution to address these disparities is inevitable, and standardization might just be one of the solutions.
Nickson M Karie While I agree that disparities exist, I would very much like to be corrected when it comes to standardization. The generalism that comes with standardization might be the reason we have so many of these models. The reason I am putting this across is: when do you realize it is the right time to standardize with these evolving technologies? For example, presently billions of devices are being connected, which means billions of entry points for adversaries; with no standard, we may end up with one big, or many, ineffective standards. Something very interesting, though, might be to go back and recap the old forensic science and follow the forensic toxicology approaches. With respect to evidence acquisition, transportation, preservation, analysis and reporting, do you think that, before we go the standardization route, there are some processes that need to be incorporated into these processes? One more thing: does reporting have anything scientific about it?
Nickson M Karie Do you think the act of mining forensic intelligence could also be something of interest when applying deep learning techniques? Could this be realistic, given that machine learning concepts are involved? Let me hear what you think.
Victor R Kebande Remember, when investigators recover digital evidence and follow proper scientific procedures to process as well as document every step, the outcome is a major source of forensic intelligence which can be used for criminal investigation and analysis. In this case, therefore, deep learning techniques have the potential to be used for mining such forensic intelligence in a way that can aid other investigators or LEAs. However, as of now I cannot fully comment from a realistic point of view, but given that machine learning concepts are evolving, this can indeed become a realistic phenomenon.
Nickson M Karie Very well. I have been thinking about this for a while and I seem to be getting closer and closer to something. Given what you have said above, extracting forensic intelligence may require one to identify significant commonalities, or aspects that cut across different areas, that could be employed by investigators. There was once an issue of putting the common attack channels, scenes and patterns in one centralized database or repository in order to achieve this goal. Now, do you think it could be possible for us to apply some statistical modelling approaches in order to discover some of these abstract behaviours across the cyberspace?
Victor R Kebande Commonalities can be used to classify different topologies as well as attack vectors. However, I am afraid that, with the changing nature of technology, more rigorous modelling approaches will be needed for investigators to discover some of these abstract behaviours across the cyberspace.
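As a rough illustration of the statistical-modelling idea discussed above, one could cluster incident records drawn from a shared repository to surface recurring attack patterns; the feature columns and values below are assumptions, not an agreed schema.

```python
# Minimal sketch: cluster hypothetical incident records to surface recurring
# attack "commonalities" for analyst review. All data here is synthetic.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)

# Hypothetical features per incident: destination port, bytes transferred,
# session duration (seconds) and number of hosts touched.
centres = np.array([[443, 5e5, 120, 3], [22, 1e4, 600, 40], [80, 2e6, 30, 1]])
spread = np.array([[5, 1e5, 20, 1], [2, 3e3, 100, 8], [5, 4e5, 10, 0.5]])
incidents = rng.normal(centres[:, None, :], spread[:, None, :],
                       size=(3, 100, 4)).reshape(-1, 4)

# Standardize, then cluster; each cluster is a candidate pattern
# (e.g. bulk exfiltration vs. lateral movement) for an analyst to review.
X = StandardScaler().fit_transform(incidents)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

for k in range(3):
    print(f"cluster {k}: mean features {incidents[labels == k].mean(axis=0).round(1)}")
```

A simple clustering like this would only be a starting point; as noted above, more rigorous models would be needed as attack behaviours change, and any discovered grouping would still require human validation before being treated as forensic intelligence.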