Having spent time in both academia and industry, I can personally attest to the difference between the way I want to be taught to code and the way code tends to be taught.
Academia suffers from a few antipatterns.
The biggest, I think, is the notion of plagiarism. In an attempt to keep students from cheating by copying code from other students or from the internet, each student is expected to write all of their code from scratch every time. This flies in the face of how coding works in practice. Reinventing the wheel when writing code is B.A.D., bad. It can have horrific consequences where security and cryptography are concerned, and it is a total waste of time. The student is then forced to unlearn this habit (hopefully) before entering the workplace. Open source is a real thing. Odds are very good that not only has someone else already written that code, but they have done it far better than any one employee could be expected to do on their own.
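To make that concrete with a toy example of my own (not from any course or assignment): password handling is exactly the kind of wheel nobody should reinvent. Python's standard library already ships a vetted key-derivation function, so the "write it yourself" version never needs to exist.

```python
import hashlib
import secrets

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Hash a password with a random salt using the standard library's PBKDF2.

    Rolling your own hashing scheme here is the classic reinvented wheel;
    the vetted implementation is a two-line call away.
    """
    salt = secrets.token_bytes(16)  # cryptographically secure random salt
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, 600_000)
    return salt, digest
```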
Another is the tendency to focus on the abstract over the practical. Don't get me wrong, the abstract is very important (and awesome), but focusing too much on theory becomes a problem when employers end up having to teach people how to actually produce code after graduation. Most people have the impression that the point of going to school is to learn the skills required to perform in a job or career. This opinion is shared by a friend of mine who graduated with a BS in computer science. Upon graduating she felt inadequate and grew afraid even to apply for jobs; everywhere she went she was told she needed skills she didn't have. In general, the successful computer science graduates I see leave school leaning heavily on the skills they taught themselves over the years while coding things for themselves.
Assignments are not like real problems. Writing code is one giant, endless collection of puzzles, and solving those puzzles (well) is an art. That art is difficult to learn from school assignments. Not always: some teachers have a knack for creating assignments that teach exactly this. Thank your lucky stars if your teacher is one of them.
Often there is a lack of focus on teaching best practices: naming conventions, modularity, neatness, organization, text editors, languages, libraries, techniques, idioms, and so on. The classroom is an excellent opportunity to spread best practices. Instructors have enormous power to shape the software landscape from the ground up; people tend to use what they know, so whatever people learn en masse in school picks up enormous momentum. Equipped with best practices, coders are less inclined to shove round pegs into square holes.
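As a toy illustration of what I mean by naming and idiom (a hypothetical before-and-after I made up, not something from any syllabus):

```python
# Before: technically works, but the reader has to reverse-engineer the intent.
def d(x, t):
    r = []
    for i in x:
        if i[1] > t:
            r.append(i[0])
    return r

# After: descriptive names, a docstring, and an idiomatic list comprehension.
def names_older_than(users: list[tuple[str, int]], age_threshold: int) -> list[str]:
    """Return the names of users whose age exceeds the given threshold."""
    return [name for name, age in users if age > age_threshold]
```

The two functions do the same thing; only one can be read at a glance, and that difference is exactly what a classroom could drill from day one.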
Topic focus tends not to reflect the needs of the industry. This is obviously a tall order and I'm not sure how to actually solve it. There is a misconception that the tech industry is constantly changing, yet there are (many) concepts which haven't changed since, get this, the 1950s. I'm serious. There is a huge opportunity in all of the remarkable advancements which have been around for decades and keep being reinvented (poorly) over and over again, then paraded around like they are something new. Academia has a chance to truly shine here: collecting and spreading the knowledge of these concepts and techniques would provide value unmatched by any other source. In some respects the landscape really does change rapidly (javascript frameworks ...), but much of that is an illusion. Shifts don't usually start and stop suddenly; there is a gradual lead-in and lead-out, so teaching people something which is on its way out is still valuable as long as technique is the focus.
I had one stellar professor who completely understood all of this. His assignments were remarkable in how specific they were in concept and how free they were in execution. No two projects ended up looking alike, yet they all focused on the same computer science principle. Wow! (The professor was Don Chamberlin.)
In the quote unquote "Real World", much of learning how to code is reading through documentation to figure out what can be done at all, and ideally in what ways it can be done. Documentation is almost always in alphabetical order. Alphabetical order is great as a reference once the coder already knows what can be done, but it completely randomizes both usefulness and frequency of use. For those of you who write references, please take this into serious consideration.
Mentorship is a good temporary solution. It may also be a long-term solution, but the underlying issues need to be solved as well. A pay-it-forward mentality will do wonders for the computing industry. So much of writing code is the ability to foresee pitfalls and navigate around them, and that is a skill that can be learned in two ways: by making the mistakes yourself, or by learning from someone who already has.
A good long-term solution is to build a voluntary, free, global certification program. If coders have taught themselves everything they need to know, they would be able to take the test and prove their worth. The business model would be to charge the people who want to hire a programmer to verify that the programmer is who they say they are and has the necessary skills. A good example of this approach is hired.com, but their tests are no good.