Implicit requirements are the hidden or assumed requirements that a system is expected to fulfill even though they were never explicitly elicited during requirements gathering.
In my experience, implicit requirements can be divided into two types:
- "simple": the client considers them obvious and therefore does not feel they need to be stated explicitly. They are often obvious only to the client, and it takes a requirements engineer experienced with the client's domain to detect them and make them explicit. Rephrasing the requirements and walking through use cases are also useful techniques here.
- "complex": they result from exceptional combinations of circumstances that the client simply did not think about. They are easier to find, by systematically exploring a model of the requirements, but it takes more time for the client to arrive at a clear statement of them.
Implicit requirements can be elicited through a fairly large set of techniques, from checklists and guided interviews to formal analysis. I'm not sure what you mean by "key features that characterize", but here is my opinion on identifying missing requirements, assumptions, and properties.
For a good overview of requirements elicitation techniques, I would recommend reading [Lam09], especially Chapter 2. It "briefly" describes techniques such as background study, data collection, questionnaires, grids and card-sorting techniques, storyboarding, scenario-driven strategies, mockups and prototypes, interviews, observations, group sessions, etc. I think all these techniques help in understanding the domain and eliciting requirements.
For model-driven requirements engineering, there is also a set of systematic techniques. A common technique for identifying missing requirements and assumptions about a software system is risk analysis. A risk can be defined as an uncertain factor whose occurrence impacts the satisfaction of high-level objectives. Risk analysis consists of a three-phase cycle: identify, assess, and control. If your focus is on identifying missing domain properties, you are probably mainly interested in the first phase, identification: by identifying the causes that could falsify your requirements, you identify (and make explicit) such missing elements.
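As a minimal illustration of the assess-and-control phases (the class, field names, and threshold below are my own simplification, not taken from any particular methodology), risks found during identification can be ranked by an exposure score, with high-exposure risks flagged as needing a countermeasure, such as a new requirement or an explicit assumption:

```python
from dataclasses import dataclass

@dataclass
class Risk:
    description: str
    likelihood: float  # estimated probability of occurrence, in [0, 1]
    severity: int      # impact on high-level objectives, 1 (minor) to 5 (critical)

def exposure(risk: Risk) -> float:
    # Classic exposure metric: probability times impact.
    return risk.likelihood * risk.severity

def needs_control(risk: Risk, threshold: float = 1.0) -> bool:
    # Risks above the threshold call for a countermeasure
    # (e.g., a new requirement or an explicitly documented assumption).
    return exposure(risk) > threshold

# Identify: risks found, e.g., by obstacle analysis or threat diagrams.
risks = [
    Risk("Ambulance gets lost on the way to the incident", 0.1, 4),
    Risk("Dispatcher mistypes the incident location", 0.3, 5),
]

# Assess and control: rank risks and flag those needing countermeasures.
for r in sorted(risks, key=exposure, reverse=True):
    print(f"{r.description}: exposure={exposure(r):.2f}, control={needs_control(r)}")
```

Real methodologies use richer scales (qualitative likelihood/severity matrices, multiple objectives), but the identify/assess/control loop has this basic shape.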
In the KAOS methodology [Lam09], risks (known as obstacles, a goal-oriented form of risk) can be identified, among other ways, through (a) formal regression from requirements and assumptions through domain properties; for example, a mobilized ambulance should reach the incident within ten minutes, but it might take longer, which reveals the missing assumption that an ambulance can get lost [Lam00]; (b) the use of patterns that encode known, standard tactics for falsifying requirements and assumptions [Lam00]; and (c) a combination of machine learning and model checking techniques [Alr12]. Obstacle analysis has been successfully applied to various application domains [Lut07,Dar07].
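To give a flavour of technique (a), here is a simplified sketch of the regression step on the ambulance example, written in a generic linear temporal logic notation (the actual KAOS formalization in [Lam00] is richer, and the predicate names here are illustrative):

```latex
\begin{align*}
&\text{Requirement:} && Mobilized \Rightarrow \Diamond_{\le 10\,\mathrm{min}}\, Intervention\\
&\text{Negated requirement (obstacle condition):} && \Diamond\,(Mobilized \wedge \Box_{\le 10\,\mathrm{min}}\, \neg Intervention)\\
&\text{Domain property:} && Lost \Rightarrow \Box\, \neg Intervention\\
&\text{Regression yields candidate obstacle:} && \Diamond\,(Mobilized \wedge Lost)
\end{align*}
```

The derived obstacle makes the implicit assumption explicit: for the requirement to be satisfiable, mobilized ambulances must not get lost, or the system needs a countermeasure for when they do.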
In the CORAS methodology [Lun11], risks are identified using threat diagrams. Identification follows a systematic approach in which threats and unwanted incidents are identified first, followed by threat scenarios and vulnerabilities.
Analyzing incidents and accidents can also reveal many underlying implicit assumptions. Such analysis is strongly connected to the risk-driven requirements engineering techniques above. In this area, I would recommend reading [Lev95] and [Lev11].
To my knowledge, risk analysis is a very effective way to discover new implicit domain properties, missing requirements, or missing assumptions.
[Lam09] A. van Lamsweerde, Requirements Engineering: From System Goals to UML Models to Software Specifications, Wiley, January 2009.
[Lam00] A. van Lamsweerde, E. Letier, "Handling Obstacles in Goal-Oriented Requirements Engineering", IEEE Transactions on Software Engineering, Special Issue on Exception Handling, Vol. 26, No. 10, October 2000, pp. 978-1005.
[Alr12] D. Alrajeh, J. Kramer, A. van Lamsweerde, A. Russo and S. Uchitel, "Generating Obstacle Conditions for Requirements Completeness", Proc. ICSE'2012: 34th Intl. Conf. on Software Engineering, Zürich, May 2012.
[Lut07] R. Lutz, A. Patterson-Hine, S. Nelson, C.R. Frost, D. Tal and R. Harris, “Using Obstacle Analysis to Identify Contingency Requirements on an Unpiloted Aerial Vehicle”, Requirements Engineering Journal 12(1), 2007, 41-54.
[Dar07] R. Darimont and M. Lemoine, “Security Requirements for Civil Aviation with UML and Goal Orientation”, Proc. REFSQ’07 – International Working Conference on Foundations for Software Quality, Trondheim (Norway), LNCS 4542, Springer-Verlag, 2007.
[Lun11] M.S. Lund, B. Solhaug and K. Stølen, Model-Driven Risk Analysis: the CORAS approach. Springer-Verlag, 2011.
[Lev95] N.G. Leveson, Safeware: System Safety and Computers. Addison-Wesley, 1995.
[Lev11] N.G. Leveson, Engineering a Safer World: Systems Thinking Applied to Safety. MIT Press, 2011.