If there is one thing in abundance in software development, it is opinions. From the choice of technology (languages, frameworks, libraries, etc.), to architectural design (everyone’s an architect!), to the user interface layout, it is never hard to find an opinion.
One thing that is rare, however, is a firm grasp on reality when it comes to these opinions. It is common for developers to have difficulty conceptualising how long it will take to implement a requirement. As a result they will often “aim too high”, setting goals that are surely not achievable within the given time constraints. The end result is that even if the requirement is implemented, it will usually arrive late, over budget, and of poor quality due to the pressure of looming deadlines and the abandonment of adequate testing (which is, without fail, always the first casualty).
Also common amongst software developers is the resume-friendly implementation. How many times have you seen an implementation use AJAX where it is completely unnecessary? I know that I have seen a few. Sure, a technology might be cool, and developers do need to keep their skills up to date, but is it the right tool for the job? Will it realistically integrate with existing infrastructure within the given time-frames? Using the wrong tools to implement production systems is ultimately irresponsible, as it can eventually lead to lower quality software and higher maintenance costs.
Typically you might expect a greater level of responsibility from software architects; however, this is not always the case. Unlike developers, architects need to be aware of the entire system infrastructure, and as a result may make safer decisions when it comes to technology choices. However, when it comes to the system design, more often than not the architecture will include needlessly complicated configuration options, allowing for the greatest number of scenarios possible. Maybe the architect’s model is sufficiently complex to allow it to be used in many different configurations, but how many configurations do you actually need? If the software is designed properly, shouldn’t you be able to add support for additional configurations at a later time, as required? This kind of unnecessary complication is also often a major factor in implementation issues.
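To make the point concrete, here is a minimal sketch (the class names and options are entirely hypothetical) contrasting a design that is configured up front for every imaginable scenario with one that supports only what is needed today and is extended later, once a real requirement appears:

```python
# Hypothetical example: two sketches of a report exporter, illustrating
# speculative configurability versus a design that grows as needed.

# Over-configured up front: every knob is guesswork about future scenarios,
# and most of them will only ever take their default value.
class ConfigurableExporter:
    def __init__(self, fmt="csv", delimiter=",", encoding="utf-8",
                 compression=None, chunk_size=1024, plugin_hooks=()):
        self.delimiter = delimiter  # the only option actually used today

    def export(self, rows):
        return "\n".join(self.delimiter.join(map(str, row)) for row in rows)

# Simpler: implement the one configuration that is actually required...
class CsvExporter:
    def export(self, rows):
        return "\n".join(",".join(map(str, row)) for row in rows)

# ...and add further configurations later, when a genuine need appears.
class TsvExporter(CsvExporter):
    def export(self, rows):
        return "\n".join("\t".join(map(str, row)) for row in rows)
```

If the simple design is clean, the later extension is a small, well-understood change rather than a risk, which is exactly the “add support at a later time” argument above.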
One of the biggest failings of the modern software development process is the disregard for Human-Computer Interaction (HCI) experts. Once upon a time, people with a good knowledge of interface design were employed to do just that. More recently the system users feel they are qualified to do it themselves (how hard could it be?), leading to glaring inconsistencies and ultimate confusion for the end user (“this isn’t what we wanted” - “but you designed it yourself!”). Or, in the slightly less disastrous situation, the developers are given free rein to design, and whilst the user interface design may be fine for developers, will it be easy for the end users to understand?
Having access to a diverse range of opinions is ultimately a good thing, but you also need people who can adequately filter those opinions in order to recognise the approaches that are in fact realistic and will achieve the desired goals. Such an ability takes a good deal of self-discipline and objectivity, which is unfortunately all too rare in Information Technology circles. :)