The answer these days basically depends on two things: how fast do you need your images, and how much post processing are you willing to do? If your answers are "instantly" and "none," then JPEG is a perfectly fine choice.
Most readers of this site, however, are looking to get the most they can from their cameras. I like to teach photographers to turn shooting into the capture of optimal data. JPEG is certainly not optimal data. In a 12mp image you should have 4000 x 3000 x 14 bits worth of data, which is about 20MBs. Typical JPEG settings these days net you about 6MBs, so compression has left you with less than a third of that data. You're not going to get the rest back.
There's also no lossy compression in (most) raw data, and a good raw converter actually expands our 20MB source file into 72MBs of actual data (12mp x 16 bits x three RGB channels). In theory, we've now got more than 10x the data of the JPEG, and there are no JPEG blocks or JPEG mosquitoes (types of compression artifacts) in it. That's more like what I would call optimal data.
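For the arithmetically inclined, the numbers above can be checked in a few lines. This is just a back-of-the-envelope sketch; the 6MB JPEG figure is the typical value cited above, not a constant:

```python
# Back-of-the-envelope data sizes for a 12mp camera (4000 x 3000 photosites).
pixels = 4000 * 3000                      # 12 million photosites
raw_mb = pixels * 14 / 8 / 1_000_000      # 14 bits per photosite -> ~21MB
jpeg_mb = 6                               # typical in-camera JPEG size (cited above)
rgb_mb = pixels * 3 * 16 / 8 / 1_000_000  # 16 bits x three RGB channels -> 72MB
print(raw_mb, jpeg_mb, rgb_mb)            # 21.0 6 72.0
```

That last number is where the "more than 10x" claim comes from: 72MB of converted RGB data versus a 6MB JPEG.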
The usual complaint I hear at this point is "but it takes too long and is too difficult to deal with raw images." That might have been true 10 years ago, but today I think it's a fallacious argument. First, we have Aperture and Lightroom, which effectively take the drudgery out of raw conversion and add a huge dose of image management, which you really should be using (you aren't saving thousands of DSC_####.JPGs any more, are you?).
Spend a little time creating presets that match your preferences, and then just ingest and enjoy. If you need JPEGs for Web, email, or whatever, just export out of these programs. It's simple, easy, and direct. But even more interesting is that Nikon stores a JPEG basic Large image in every raw file. Recent Nikon DSLRs are so good that JPEG basic is darned good at all but the most extreme ISO values. So, if you need a JPEG instantly, just extract it. (Pity that Nikon doesn't make it easy, but they've been lost on the subject of workflow pretty much from the beginning.)
What? How do you do that? I've been telling my book customers how for years, but one simple solution is Instant JPEG From Raw.
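If you're curious what a tool like Instant JPEG From Raw is doing under the hood, here's a minimal sketch of the idea in Python. It's a deliberate simplification: a proper extractor reads the offsets stored in the NEF's TIFF/EXIF structure, whereas this just scans the raw bytes for JPEG start-of-image and end-of-image markers, which can produce false positives:

```python
def find_embedded_jpegs(data: bytes):
    """Return (start, end) byte spans of JPEG streams found inside a raw file.

    Crude sketch: we scan for the JPEG start-of-image marker (FF D8 FF) and
    take everything up to the next end-of-image marker (FF D9). Real tools
    read the embedded JPEG's offset and length from the NEF's TIFF tags.
    """
    spans = []
    pos = 0
    while True:
        start = data.find(b"\xff\xd8\xff", pos)
        if start < 0:
            break
        end = data.find(b"\xff\xd9", start + 3)
        if end < 0:
            break
        spans.append((start, end + 2))  # end is inclusive of the EOI marker
        pos = end + 2
    return spans
```

In practice you'd keep the largest span found and write those bytes out as a .jpg. A more robust route is exiftool, whose `-b -JpgFromRaw` option pulls the embedded JPEG out of a NEF by reading the actual offsets.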
So the real question should be: why the heck are you still shooting JPEG? You've already got one. Plus you can get more pretty quickly and easily by just using the right tools at your computer. Thing is, you can't do it the other way round. If you only shoot JPEG, you'll never get the other 14 to 66MBs of data back, ever. Never.
Okay, let me come at this a different way. Let's say that you're Disney. You're readying your latest family feature film. Today, the best broadcast TV output is 1080i/60. The best consumer output is Blu-ray at 1080p/30 (actually, still 1080i/60). Typical digital theatre output is 2k+, though many theaters are now 4k (7mp to 12mp depending upon which of the four definitions you're using). There are people talking about 8k and even Ultra HD (16x what your HD TV can do). So here's the question: do you shoot in the lowest quality format (1080i/60) or do you use the best format available to you? Well, since you're Disney, the answer is easy: you use the best format available. Why? Because you expect to still be using that film for generations to come, and you want it shown in its best possible form.
So what about you? Do you want your images to be seen by future generations at their best? If your answer is yes, then you shoot raw. If your answer is no, what the heck did you read this far for?