All of the above answers should prove useful. I would be remiss, however, if I did not point out that two weeks may not be enough time to produce trustworthy qualitative analysis. Prolonged engagement with the data during the analysis stage is one hallmark of trustworthiness, and I do not think two weeks would be considered sufficient engagement. Now, if you are conducting a simple, thin content analysis involving frequency counts of words or phrases, two weeks for analysis may be enough time. But if you are looking to produce rich, thick, high-quality analysis, I think two weeks is unrealistic.
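To make the distinction concrete, a "thin" content analysis of the kind mentioned above can be reduced to a few lines of code. The sketch below (the transcripts and phrases are invented for illustration) counts word and phrase frequencies across a set of interview texts using only the Python standard library:

```python
# Minimal sketch of a "thin" content analysis: frequency counts of
# words and optional phrases across documents. The example transcripts
# and the phrase list are illustrative assumptions, not real data.
from collections import Counter
import re

def word_frequencies(texts, phrases=None):
    """Count whole-word tokens (and any phrases of interest) across documents."""
    counts = Counter()
    for text in texts:
        lowered = text.lower()
        # Tokenize into lowercase words (apostrophes kept, punctuation dropped)
        counts.update(re.findall(r"[a-z']+", lowered))
        # Additionally count any multi-word phrases supplied by the analyst
        for phrase in (phrases or []):
            counts[phrase] += lowered.count(phrase.lower())
    return counts

transcripts = [
    "I felt supported by my team. The team meetings helped.",
    "Support from management was lacking, but the team helped.",
]
freq = word_frequencies(transcripts, phrases=["team meetings"])
print(freq.most_common(5))
```

This is exactly the kind of analysis that *can* fit in two weeks; interpreting what the counts mean, and whether they hold up against the full context of each interview, is the part that cannot be compressed.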
I know this advice may appear odd and not technologically savvy, but sometimes working the old-fashioned way (i.e., making your own written notes, scribbles, cards, stickers, highlights, drawings, maps, etc.) can be the most effective method -- both time-wise and effort-wise.
If the data set is not huge and not multi-modal, this could be the best way to analyse it.
(Sometimes we feel -- or are pressured to feel -- that we must use software when, in fact, we would do just as well without it.)
All of the qualitative data analysis programs do very similar things, in terms of marking your data to code it and then searching through your codes. They do differ in their "look and feel" so you may find one program more comfortable for you. In particular, some people find NVivo's emphasis on "nodes" (rather than codes) to be counter-intuitive.
Fortunately, all of the major programs, such as MAXQDA, Dedoose, and ATLAS.ti, have websites with detailed video tutorials. Those will give you an idea of both how the programs are set up and how useful the instructional resources are.
Given your need for a quick decision, I personally would recommend MAXQDA.
I agree with Larissa above. This is from personal experience, after trying to code my data in NVivo for my PhD project and not progressing as much as I had expected. The manual way, along with all the notes, tables, and highlights, helped me untangle the mess I was in. I used basic MS Word tables to categorise initial key statements/phrases for each interview. Then a lot of handwritten notes and scribbles of cutting, re-writing, etc. went into refining this initial data. It was from this basic form that I went on to develop themes and build my final story.
I agree with Larissa and Ruwangi that using 'manual' methods, e.g. notes, tables, comments etc. can be very effective. For example, you could use the coloured highlighting feature in MS Word to 'code' sections of text (with different colours corresponding to different codes), and use comment boxes to explain your thinking or to reflect on the text in more detail. I have used this method in the past and it can work very well, particularly in studies with a relatively small sample.
Perhaps the main purpose of specialist qual analysis software like NVivo is to make systematic analysis more straightforward and easier for the researcher. If you don't feel that it achieves this goal for you, you shouldn't feel pressured into using it.
With regard to the debate about analyzing the data "by hand" versus using software, almost all the key features of software in this area are based on manual techniques -- such as marking data to code it. Software's major advantages for coding lie in dealing with large data sets or complex coding procedures. In addition, it is more efficient for searching your coded data, which again matters most when you have a large or complex data set.
So, if you have a reasonably small and straightforward set of data (e.g., 5 or 6 individual interviews), then coding the data by hand makes sense.
I use ATLAS.ti since I don't know any other software, but I can say it is absolutely sufficient for analysing qualitative data. In fact, I have 104 documents that will be analysed during this week.
Unfortunately, I needed several hours to discover each of ATLAS.ti's capabilities, but in the end the data can be transferred into SPSS, so I think two weeks really is enough for this mission.
I used SPSS. You can turn most of your qualitative data into another type of variable if it makes sense (if it does not, SPSS is not your best choice :)).
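The recoding step described above -- turning categorical responses into numeric variables before importing them into a statistics package such as SPSS -- can be sketched in a few lines. The coding scheme and responses below are illustrative assumptions:

```python
# Hypothetical sketch: recoding qualitative responses into numeric codes
# prior to statistical analysis (e.g., in SPSS). An ordinal mapping like
# this only makes sense when the categories have a natural order.
CODING_SCHEME = {"low": 1, "medium": 2, "high": 3}

def recode(responses, scheme):
    """Map categorical responses to numeric codes; None if a response is unmapped."""
    return [scheme.get(r.strip().lower()) for r in responses]

answers = ["High", "low", "Medium", "high"]
print(recode(answers, CODING_SCHEME))  # → [3, 1, 2, 3]
```

Unmapped responses come back as `None`, which is a useful reminder that some qualitative answers simply do not fit a numeric scheme -- the "if it does not, SPSS is not your best choice" case.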
If you want to process in a decision tree structure, DEXi is a good bet.
I think NVivo has developed considerably as a tool/software for facilitating qualitative data analysis in terms of organizing, storing, administering, searching, and ordering data. However, it is worth bearing in mind that it is not the software/computer that interprets and analyses the data/text, but the person/researcher/analyst.
I think that the simplest way to analyse qualitative data is by hand, but as Professor Morgan states, this works best if you do not have a lot of data. When analysing by hand, the data set needs to be small enough to keep an overview of. If analysing by hand, you do not have to spend time figuring out how to handle a tool you have not used before, and in that way it is time-saving.
However, I do not think there is any really quick way to perform an analysis of qualitative data, at least not in a way that upholds high quality.
I prefer using software, including NVivo, to organize qualitative data (not to analyze it, as is often misunderstood). When using software to manage/organize the data, the process can also help facilitate the analysis of entangled concepts/themes. For manual thematic analysis, I have found multicoloured highlighters useful, as well as additional analysis by independent persons (i.e., people who have not collected the data) during the analysis, as a means of confirming themes and minimizing bias.