Ask a linguist what's the difference, and she'll probably say it's a matter of syntax. In a human language, the words have to come in a certain order. 'John hit Bill' is different from 'Bill hit John'. Or if I say "The fluffy bunny exploded," you have an automatic understanding that 'fluffy' and 'bunny' have a special relation to each other. They've grouped into a structural unit. Non-human communication — and even apes who are taught human language — never shows any syntax of this type.
(If you're new to this area, here's a really good article about it. All the major players weigh in, and it's very readable.)
But beyond the 'language or not' issue, there's an even more interesting discussion. Namely, if it's not language, what is it?
Linguists break into two camps. One camp says that human language is something qualitatively different from animal communication. This would be Chomsky et al. They'd say there's a Language Acquisition Device in the human brain (as yet undiscovered) that no other animal has, and though other animals may be intelligent and communicate, they'll never 'graduate' to real language use. We humans have the principles of syntax — the ones that all human languages follow — hard-wired into our human brains.
Then there's the other camp, who say human language is just more complex than animal communication. Maybe there's a continuum where animal communication can be more or less language-y, and all animals fall short of real language behaviour. Maybe if animals were doing something different, they'd slide up closer to language. Maybe syntax is something a very smart animal can do, and if other animals were smarter, they'd do syntax too. Maybe people use syntax to keep everything straight because talking is so demanding. And so on.
This view is interesting because if we suppose there's a scale of languageness, we can see how far up the scale animals can go. Which takes us to some interesting work from a while back.
Nonhuman primates are unable to grasp a fundamental grammatical component used in all human languages, researchers at Harvard University and the University of St Andrews in Scotland reported recently in the journal Science. Their work provides the clearest example to date of a cognitive bottleneck during the evolution of human language, suggesting a sharp limit to animals' capacity to generate open-ended communication and possible restrictions on other domains of thought.

The experiment was this: they played recordings of a man and a woman speaking nonsense syllables to groups of cotton-top tamarins. The actual syllables weren't important; what mattered was the male/female order of the voices.
One group heard patterns of male and female voices that could be generated by a regular grammar, the simplest kind of pattern generator you can have and still call it syntax. This generates very simple sequences like MFMFMF. Once the monkeys got used to the pattern, the experimenters broke the rules by switching up the sequences. Sure enough, the monkeys noticed; they would turn their heads to the loudspeaker as if to say, "What the?"
But other monkeys got patterns generated by a context-free grammar, one step up in complexity: some number of male voices, followed by the same number of female voices. Here the monkeys would hear patterns like MF, or MMFF, or MMMFFF. When these patterns were broken, the monkeys didn't even notice, which suggests that these grammars were too complex for them.
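If you want to see why the second pattern is genuinely harder, here's a minimal sketch of the two checkers (the function names are mine, and I'm using the textbook formal-language sense of 'regular' and 'context-free'; this isn't code from the study). The alternating pattern can be matched by a regular expression — that is, by a finite-state machine with no memory — while the M-then-F pattern requires counting the Ms and matching them against the Fs, which no finite-state machine can do:

```python
import re

# (MF)(MF)(MF)... : a regular pattern. A plain regex handles it,
# because checking it needs no memory beyond the current symbol.
def matches_regular_pattern(seq):
    return re.fullmatch(r"(MF)+", seq) is not None

# M...MF...F (equal counts): a context-free pattern. The checker has
# to count the Ms and verify the Fs match, which takes stack-like memory.
def matches_context_free_pattern(seq):
    n = len(seq)
    if n == 0 or n % 2 != 0:
        return False
    half = n // 2
    return seq[:half] == "M" * half and seq[half:] == "F" * half

print(matches_regular_pattern("MFMFMF"))        # True
print(matches_regular_pattern("MMFF"))          # False: alternation broken
print(matches_context_free_pattern("MMMFFF"))   # True
print(matches_context_free_pattern("MFMFMF"))   # False: not M-block then F-block
```

The tamarins, in effect, behaved like the first function and not the second.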
Most human languages are a step up even from that, following rules allowed by 'context-sensitive' grammars. So, conceptually, the syntax of human language is way beyond the capabilities of even these clever types.
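For a feel of what 'context-sensitive' adds, the textbook example is the pattern aⁿbⁿcⁿ: three blocks whose lengths must all agree. (The symbols and function name here are just illustrative, not anything from the study.) A finite-state machine can't count at all, and a stack-based context-free machine can match two counts against each other but not three:

```python
# a^n b^n c^n: the classic context-sensitive pattern. Two matched
# counts are context-free; three matched counts are not.
def matches_context_sensitive_pattern(seq):
    n = len(seq)
    if n == 0 or n % 3 != 0:
        return False
    k = n // 3
    return seq == "a" * k + "b" * k + "c" * k

print(matches_context_sensitive_pattern("aabbcc"))   # True
print(matches_context_sensitive_pattern("aabbbc"))   # False
```

Natural-language phenomena like cross-serial dependencies are the usual argument that human syntax needs at least this much power.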
Some animal researchers claim that their African Grey Parrots understand them and generate real English sentences. I'd love to see what kinds of patterns these birds are capable of. It seems this kind of test could help sort out the difference between simple parroting and real language use.