I guess this is a perennial hot topic. Well-known techniques such as LOC, Function Points, and the COCOMO models are all well established. I think there is a lot of research to be done on experience-based metrics, where domain knowledge and skills are required. How do we develop a tool? Also, I recently saw an article exploring FP for new technology projects such as cloud and SOA; it is an interesting read. I think we should form a small working group, and we could contact Barry Boehm, who is the pioneer in cost estimation methods. FP link: http://www.servicetechmag.com/I68/1112-4
I have recently done a non-systematic review of this topic: https://www.researchgate.net/publication/233759508_Automatic_Control_of_the_Software_Development_Process_with_Regard_to_Risk_Analysis (chapter 3)
The most used estimation techniques are the ones listed before: LOC, FP, User Stories (if you are "agile"). But after years of development, I think the best thing you can do is gather the best engineers in your organization for three hours in a room, discuss the problem to be solved, and apply Wideband Delphi: http://en.wikipedia.org/wiki/Wideband_delphi
You divide the room into several teams; each one estimates the project, and the rest get feedback by listening to each other. After a few iterations you will have a good estimate.
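The iteration loop above can be sketched in code. This is an illustrative aggregation only (the numbers and the 20% convergence threshold are invented, not part of Wideband Delphi itself): each round collects independent team estimates, and discussion continues until the spread around the median is small.

```python
# Hedged sketch: summarizing Wideband Delphi rounds. Teams estimate
# independently, hear each other's reasoning, and re-estimate until
# the relative spread of the estimates is acceptably small.

def delphi_round_summary(estimates):
    """Return (median, relative spread) for one round of team estimates."""
    ordered = sorted(estimates)
    n = len(ordered)
    median = (ordered[n // 2] if n % 2 else
              (ordered[n // 2 - 1] + ordered[n // 2]) / 2)
    spread = (max(ordered) - min(ordered)) / median
    return median, spread

# Three rounds of person-day estimates from four teams; numbers invented.
rounds = [
    [80, 200, 120, 60],    # round 1: wide disagreement
    [100, 150, 120, 90],   # round 2: after discussion
    [110, 130, 120, 115],  # round 3: converging
]
for i, r in enumerate(rounds, 1):
    median, spread = delphi_round_summary(r)
    print(f"round {i}: median={median} person-days, spread={spread:.0%}")
```

In practice you would stop iterating once the spread drops below whatever tolerance the group agreed on, and take the final median as the estimate.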
It depends on when you want to estimate. If you are looking for early estimates, I would argue that a-priori techniques (such as function points) are the best. In this context, you can use measurement procedures to apply COSMIC FP or IFPUG FP to conceptual models.
If at the time of estimation you have your use cases ready, then it seems to be a very good method. A colleague of mine, Miroslaw Ochodek, recently proposed Estimated UCP, i.e. applying Use-Case Points to an estimated number of steps or transactions. From the point of view of our Software Development Studio (where use cases are mandatory), this method looks very attractive.
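For readers unfamiliar with the method, here is a minimal sketch of the standard Use-Case Points calculation (Karner's weights); the Estimated UCP variant mentioned above would feed *estimated* transaction counts into the same formula. The TCF/ECF values, the actor mix, and the 20 hours-per-UCP productivity factor below are illustrative assumptions, not fixed constants.

```python
# Hedged sketch of the Use-Case Points (UCP) calculation.
# UCP = (UUCW + UAW) * TCF * ECF; effort = UCP * hours-per-UCP.

def use_case_weight(transactions):
    """Karner's weights: simple <=3, average 4-7, complex >=8 transactions."""
    if transactions <= 3:
        return 5
    if transactions <= 7:
        return 10
    return 15

def ucp(use_case_transactions, actor_weights, tcf=1.0, ecf=1.0):
    uucw = sum(use_case_weight(t) for t in use_case_transactions)
    uaw = sum(actor_weights)          # 1 = API actor, 2 = protocol, 3 = human/GUI
    return (uucw + uaw) * tcf * ecf

points = ucp([2, 5, 9], actor_weights=[3, 1], tcf=0.95, ecf=1.05)
effort_hours = points * 20            # 20 h/UCP is a commonly cited default
print(f"{points:.2f} UCP -> {effort_hours:.0f} hours")
```

With estimated (rather than counted) transaction numbers per use case, the same arithmetic yields the early Estimated UCP figure.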
The Scrum development process uses agile estimation procedures that are claimed to be radically more accurate and 50 times faster than traditional estimation. Based on research at the Rand Corporation in the 1940s, the contemporary variant of the Delphi method has been used in many recent Microsoft projects and compared with their traditional estimates. See http://scrum.jeffsutherland.com/2010/04/story-points-why-are-they-better-than.html for links to the research paper.
However, there exist other models specifically tailored for the requirements engineering process, like MARCS (Macaulay, UMIST) and DYNASIS (Williams, LSBU).
I agree with Fernando. In my experience, the best thing you can do is gather the team members in a board room, discuss the problem to be solved, and apply Wideband Delphi: http://en.wikipedia.org/wiki/Wideband_delphi
It is very hard to establish absolute measures. People are much better at estimating relative values and comparisons, so you need to establish some references before you engage in cost estimation. If your team is using an agile process like Scrum, you might be able to establish some metrics on what has been done in a few iterations, and then use those metrics to perform relative estimates and extrapolate.

Another approach is to estimate effort based on comparisons with previously completed work. For example, if you and your team agree that the new project is about three times the effort of some project you accomplished before (note the relative measure), then you can use SLOC, FP, COCOMO, and other methods to analyze the existing project and project three times those measures. You can combine these techniques nicely to get meaningful numbers: size your previous project in SLOC (e.g. 50,000 lines of Java), "predict" your new project to be equivalent to the effort needed for 150,000 lines of Java, and then use that figure in a COCOMO tool to estimate the effort and total cost. This approach can work even if your new project will not be in Java, simply because everything gets converted into effort (i.e. person-days). COCOMO tools also allow for other parameters, such as how much code is expected to be reused, how skillful your team is, etc. However, please do not forget that this is estimation, not exact math. Hope this helps.
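The scale-then-model step above can be made concrete with a small sketch. It uses the published Basic COCOMO coefficients for "organic" projects (a=2.4, b=1.05 for effort; c=2.5, d=0.38 for schedule); the 50 KLOC baseline and 3x relative factor are the example numbers from the text.

```python
# Sketch of the relative-sizing approach: measure a finished project,
# scale by the agreed relative factor, then run Basic COCOMO (organic mode)
# on the scaled size.

def cocomo_basic(kloc, a=2.4, b=1.05, c=2.5, d=0.38):
    """Basic COCOMO, organic mode: effort in person-months, schedule in months."""
    effort = a * kloc ** b
    schedule = c * effort ** d
    return effort, schedule

previous_kloc = 50        # measured on the finished Java project
relative_factor = 3       # team judgment: "about three times the effort"
effort, schedule = cocomo_basic(previous_kloc * relative_factor)
print(f"~{effort:.0f} person-months over ~{schedule:.0f} months")
```

Real COCOMO tools would then adjust this with cost drivers (reuse, team capability, etc.), as the answer notes; the raw Basic-model number is only a starting point.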
I agree with all the suggestions and discussions. The question was what the "popular" cost estimation techniques are. Yes, the literature includes COCOMO, FP, Use Case Points, etc., but I think and believe that the "popular" technique, you will find, is not very formal. When I was working in industry, we asked a lot of questions of the type asked in the COCOMO technique: Have we done this before? What are the unknown/unproven areas? What is our history with projects similar to this one? What is the approximate size of this (in LOC, FP, use cases, etc.)? What type of customer are we dealing with? And so on.

Then we would ask people to provide a high-level decomposition of the system architecture (an early solution) and a quick size estimate (in LOC, FP, use-case points, number of DB tables, etc.). Then we would "guess" at the expected productivity rates from past experience. After dividing the size by the productivity rate, we get the initial effort estimate. Then we look at who we intend to assign to the tasks and fold in a "fudge factor" based on what we know about the people. Many managers in small to medium-sized organizations carry this out in their heads and come up with their cost estimates. This, I believe, is the "popular" technique. I have also written a book on project management, Managing Systems and IT Projects (published by Jones and Bartlett). You may find it interesting to take a look at.
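The informal calculation described above is just size divided by productivity, times a fudge factor. A minimal sketch, with all numbers invented for illustration:

```python
# Sketch of the informal "popular" estimate: size / productivity * fudge.

def rough_effort(size, productivity_per_day, fudge=1.0):
    """Effort in person-days; fudge > 1.0 for less experienced assignees."""
    return size / productivity_per_day * fudge

estimate = rough_effort(
    size=40_000,               # rough LOC from the early decomposition
    productivity_per_day=100,  # LOC/person-day guessed from past projects
    fudge=1.25,                # newer team members on this one
)
print(f"{estimate:.0f} person-days")
```

The point of writing it down, even this simply, is that the guessed inputs (size, productivity, fudge) become explicit and can be revisited, instead of living only in a manager's head.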
One thing you should also take into consideration is the software domain for which you want to do cost estimation. Most of the answers already given apply very well to data-centric software, such as administrative software. If you want to do software cost estimation in domains like real-time embedded software, SOA services, or mobile apps, you should take a look at the COSMIC method.
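For readers new to COSMIC, the core idea is that the functional size of a process is the count of its data movements: Entry, eXit, Read, and Write, each worth 1 CFP. A tiny sketch (the example process and its movements are invented; real measurement follows the COSMIC Measurement Manual's rules):

```python
# Hedged sketch of the COSMIC sizing idea: one CFP per data movement.

COSMIC_MOVES = {"E", "X", "R", "W"}   # Entry, eXit, Read, Write

def cosmic_cfp(movements):
    """Functional size in CFP: one point per data movement."""
    assert all(m in COSMIC_MOVES for m in movements), "unknown movement type"
    return len(movements)

# Invented "read sensor and report fault" process in an embedded controller:
# Entry (trigger), Read (calibration data), Write (event log), eXit (fault msg)
print(cosmic_cfp(["E", "R", "W", "X"]), "CFP")
```

Because data movements exist even where there are no screens or database tables, this style of counting fits embedded, SOA, and mobile software better than transaction-oriented FP variants.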
Renault recently presented at the ERTS conference how they manage automotive embedded software development cost and productivity through the automation of a functional size measurement method (COSMIC). You can download the paper from http://www.erts2014.org/Site/0R4UXE94/Fichier/erts2014_7D3.pdf.
Renault will also speak about this at the IWSM Mensura conference in October in Rotterdam. This conference is a major platform for COSMIC research and experience reports. You can access the papers presented there at http://www.iwsm-mensura.org/conference-locations.
Apart from the other answers, I suggest the planning poker used in Scrum. Based on my experience, the most accurate estimation is done from our own performance data, as in PSP (the Personal Software Process), using linear regression and the PROBE method.
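The regression step of PROBE can be sketched briefly: fit actual effort from past tasks against their estimated proxy sizes, then project the new task's size through the fitted line. The historical data points below are invented for illustration.

```python
# Sketch of the PROBE regression idea from PSP: predict effort for a new
# task from one's own historical (estimated size, actual effort) pairs.

def linear_fit(xs, ys):
    """Ordinary least squares for y = b0 + b1 * x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b1 = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
          / sum((x - mx) ** 2 for x in xs))
    b0 = my - b1 * mx
    return b0, b1

estimated_size = [120, 200, 300, 450]  # proxy size (e.g. LOC) per past task
actual_hours   = [10, 15, 24, 35]      # effort actually spent
b0, b1 = linear_fit(estimated_size, actual_hours)
print(f"predicted effort for size 250: {b0 + b1 * 250:.1f} h")
```

The full PROBE method also checks the quality of the fit (correlation, prediction intervals) before trusting the regression; with too little or too scattered personal data, it falls back to simpler averaging.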
As you know, there are lots of SCE techniques available, but in my opinion they should be combined with optimization algorithms to produce more accurate estimates, for example meta-heuristics like the Imperialist Competitive Algorithm, PSO, and others.
Some new techniques are also under development, such as those targeting SOA.