January 28th, 2010
My start into the fourth day of OOP was hard-core techie stuff again: “REST with Java” by Stefan Tilkov. A great overview of what REST is about, its implications on architecture, the available tools and libraries, etc. He managed to cover a lot of ground in his talk, but even 90 minutes were too short. Stefan used to be a dyed-in-the-wool web service advocate. However, experience has taught him that web services have some serious issues (e.g. protocol independence) and that REST is a much better way to do SOA. Or maybe as a book author you always have to ride the bow wave… He was very convincing, though. Still, web services are not obsolete all of a sudden. Existing interfaces are easier to expose as a web service, but for new interfaces a REST approach should be considered. Last but not least, JAX-RS (the Java API for REST) is a nice example of the power of annotations in Java (much nicer than bean validation annotations).
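To illustrate why the JAX-RS annotations are appealing, here is a minimal sketch of a resource class. The CustomerResource and everything in it are made up for illustration; and to keep the snippet compilable and runnable without a JAX-RS jar on the classpath, tiny stand-ins for the real javax.ws.rs annotations are declared in the same file:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

// Minimal stand-ins for the real javax.ws.rs annotations, declared here
// only so this sketch compiles and runs without a JAX-RS jar.
@Retention(RetentionPolicy.RUNTIME) @interface Path { String value(); }
@Retention(RetentionPolicy.RUNTIME) @interface GET {}
@Retention(RetentionPolicy.RUNTIME) @interface Produces { String value(); }

// In JAX-RS a resource is a plain class; the annotations declaratively
// map HTTP requests onto methods.
@Path("/customers")
class CustomerResource {
    @GET
    @Produces("text/plain")
    public String list() {
        return "Alice,Bob";
    }
}

public class Main {
    public static void main(String[] args) {
        // A JAX-RS runtime (e.g. Jersey) reads these annotations via
        // reflection to route requests; here we just inspect one.
        Path p = CustomerResource.class.getAnnotation(Path.class);
        System.out.println(p.value()); // prints /customers
    }
}
```

The point is that the class itself stays a plain Java object; all the HTTP plumbing lives in declarations rather than in inherited framework types.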
The second talk was about “Model-driven renovation of a legacy system“. The German air traffic control has a simulation system for training and evaluation purposes. It is built in C++ and Java, has been developed over the past 15 years, and its architecture has deteriorated to the point that it is hardly maintainable. Using the openArchitectureWare toolset (now part of the Eclipse Modeling Project) they were able to factor out all entity definitions and simply generate (semantically equivalent) C++ or Java code. Interestingly enough, they first used MagicDraw to draw UML diagrams which were then fed to openArchitectureWare. They soon realized that drawing UML was the wrong way and resorted to a textual DSL (which is the forte of openArchitectureWare anyway). The domain logic is still handcoded, but clean entity definitions paved the way for substantially improving the architecture.
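The core idea of generating code from factored-out entity definitions can be sketched in a few lines. The toy entity “DSL” below is entirely invented for illustration (a real openArchitectureWare grammar and its templates are far richer), as is the Flight entity; the generator simply turns one DSL line into an equivalent Java class:

```java
public class Main {
    // Toy entity DSL, invented for this sketch:
    //   "entity Name { field:Type field:Type ... }"
    // A code generator turns it into a matching Java class body.
    static String generateJava(String dsl) {
        String name = dsl.substring("entity".length(), dsl.indexOf('{')).trim();
        String body = dsl.substring(dsl.indexOf('{') + 1, dsl.indexOf('}')).trim();
        StringBuilder out = new StringBuilder("class " + name + " {\n");
        for (String field : body.split("\\s+")) {
            String[] parts = field.split(":");      // e.g. "callsign:String"
            out.append("    private ").append(parts[1])
               .append(' ').append(parts[0]).append(";\n");
        }
        return out.append("}\n").toString();
    }

    public static void main(String[] args) {
        // The same DSL source could just as well feed a C++ template,
        // which is what makes the approach attractive for a mixed codebase.
        System.out.print(generateJava("entity Flight { callsign:String altitude:int }"));
    }
}
```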
Robert C. Martin (also known as Uncle Bob), one of the most prominent figures in OO and Agile, presented two talks today. One was about “The Polyglot Craftsman” and the other was titled “S.O.L.I.D. – fifteen years later“. Listening to Uncle Bob is like watching American TV: very funny and entertaining, but with little information. The only obvious difference is that American TV is more colorful and animated than Uncle Bob (although he keeps walking across the stage during his entire presentation): for the first talk he used just a single slide with the title of the talk. Furthermore, he seems to be an astronomy geek, because he started both talks with some astronomy topic (for at least 10 minutes each) which had hardly anything to do with what he was supposed to talk about.
The conference concluded with two more presentations. The first one was a vendor talk by Adobe: “RIA: Productivity by Design“. An awful presentation with confusing and badly prepared demos that hardly scratched the surface of the topic. Quite a few people left well before the end. The last presentation I attended was about Pattern Testing. The topic was interesting, but I didn’t like the presentation style, and the speaker could have conveyed the essential bits and pieces in less than half the time he spent. All in all, the last day of the conference was not as good as the previous three.
Overall, though, the OOP went well beyond my expectations: a broad range of interesting topics, mostly excellent speakers, and a perfect venue with great infrastructure and superb food. The only thing to criticize (apart from my occasional bad luck in picking presentations) was the lack of Internet connectivity, which is a bit of a shame for an IT conference. WLAN was available, but at 30 Euros for 8 hours (they must be kidding).
January 27th, 2010
What I really like about the OOP (I have never been to this conference before) is the interesting medley of hard-core technical topics and soft topics like development processes. Having had my share of soft topics yesterday, I decided to look a little more into hard-core techie stuff. The first session was presented by Jürgen Höller, Spring’s principal engineer. He talked about “The New Features of Spring 3.0“. One of the highlights is that Spring 3.0 now requires at least Java 5. This gave them the opportunity to use generics and annotations. Generics can be a beast and they are a viral feature, but I certainly like the enhanced expressiveness of the code (with more type safety as an additional bonus). Spring 3.0 also brings along nice REST support and component stereotypes. One can easily define a new stereotype (as an annotation) by declaring it with all the properties (as annotations) it needs. A nice way to factor out your annotations for a component type. They also embraced JSR-303 for bean validation. I am not a big fan of bean validation. You start out annotating your bean with simple constraints. However, validation is rarely simple, and pretty soon you end up with way too many annotations per property. Then you start to implement the more complex validations (i.e. the ones that are hard or impossible to define declaratively) in a validation class. Now you have two places defining a bean’s validation. Maintenance will certainly be a pleasure in this scenario, unless you migrate all validations into the validation class and get rid of the validation annotations. I discussed this with Jürgen after the session and he is not happy about this either. I would recommend forgoing declarative validation and putting all validation in a class of its own from the beginning.
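The stereotype composition mechanism is plain Java meta-annotation: you annotate your new annotation with the annotations it bundles. The sketch below uses an in-file stand-in for Spring’s @Component (so it runs without Spring on the classpath), and the BusinessService/OrderService names are invented; Spring’s classpath scanner discovers such meta-annotations essentially the same way, via reflection:

```java
import java.lang.annotation.*;

// Stand-in for Spring's org.springframework.stereotype.Component,
// declared here only so the sketch runs without Spring.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)
@interface Component { String value() default ""; }

// A custom stereotype: a single annotation that bundles the component
// role (in real Spring 3.0 it could also bundle @Scope and the like).
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)
@Component
@interface BusinessService {}

@BusinessService
class OrderService {}

public class Main {
    public static void main(String[] args) {
        // Spring's scanner walks annotations transitively; the same
        // information is reachable with plain reflection:
        for (Annotation a : OrderService.class.getAnnotations()) {
            boolean isComponent =
                a.annotationType().isAnnotationPresent(Component.class);
            System.out.println(a.annotationType().getSimpleName()
                + " meta-annotated with @Component: " + isComponent);
        }
    }
}
```

So a class carrying @BusinessService is transitively a component, without repeating the bundled annotations at every use site.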
The second presentation was given by Ursula Meseberg, who talked about her experience with merging usability engineering into a Scrum-based development process. User interface design and usability engineering involve some up-front and long-running tasks, which is not compatible with the narrow-scoped, short-term tasks in Scrum. After some trial and error they eventually succeeded by strictly separating view and view model and by bringing a user interaction designer into the development team. The user interaction designer only cares about the view and the view model as far as user interaction is concerned. All graphical and visual design aspects are applied independently of the development cycle by a graphics designer.
The afternoon keynote, “The neighbour’s garden – where architects learn” by Gernot Starke, was very entertaining and still good food for thought. His point was that it is very hard for architects to learn about architectures. There is vast literature available, but an architect should also look at successful real-world architectures, and this is where the problem starts. It’s either well-documented architectures like Flickr’s (but how many architects have to build something which resembles Flickr?) or it’s a code dump (aka open source), where it is very hard to find the higher-level structures which make up the architecture. Moreover, the most important piece of information, namely why something was built this way and not some other way, is usually missing. Most architectures are never disclosed anyway, since most companies consider them a competitive advantage. I also liked his statements about how software architects can improve. Learning a new language such as Scala or Haskell adds practically nothing to an architect’s qualifications. Improving one’s communication skills can make a vast difference, though. And successful software architects shy away from the cool stuff (i.e. the latest hype) in order to minimize risk.
For the last session I had intended to learn something about application integration with REST. However, while preparing for my talk yesterday I sat next to Gregor Hohpe (software architect at Google) in the speakers’ lounge. He made quite a few interesting comments on various topics, which convinced me to attend his talk on “Distributed Computing the Google Way” instead. The most rewarding part of this talk was not how the Google File System (or BigTable, MapReduce, Sawzall) works but what the design decisions were (exactly what Gernot Starke demanded) and what does and does not work with these technologies. Sometimes you can even cheat a little to make them run in unexpected scenarios. You can find the slides here.
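The MapReduce programming model itself is small enough to sketch in a few lines: a map phase emits key/value pairs, a shuffle groups them by key, and a reduce phase folds each group. The canonical word-count example below is plain, single-process Java and nothing Google-specific; it only shows the shape of the model, not the distributed machinery that makes it interesting at Google’s scale:

```java
import java.util.*;
import java.util.stream.*;

public class Main {
    // Map phase: each input record (a line) is mapped to (word, 1) pairs.
    static Stream<Map.Entry<String, Integer>> map(String line) {
        return Arrays.stream(line.toLowerCase().split("\\s+"))
                     .filter(w -> !w.isEmpty())
                     .map(w -> Map.entry(w, 1));
    }

    static Map<String, Integer> wordCount(List<String> lines) {
        // Shuffle + reduce: group the pairs by key, then sum each group.
        return lines.stream()
                    .flatMap(Main::map)
                    .collect(Collectors.groupingBy(
                        Map.Entry::getKey,
                        Collectors.summingInt(Map.Entry::getValue)));
    }

    public static void main(String[] args) {
        Map<String, Integer> counts =
            wordCount(List.of("to be or not to be", "be quick"));
        System.out.println(counts.get("be")); // prints 3
        System.out.println(counts.get("to")); // prints 2
    }
}
```

Because map and reduce are pure functions over independent records and groups, the framework is free to partition the input and run both phases in parallel across many machines, which is the whole point of the design.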
Bottom line: like yesterday, there was not a single disappointing talk. Kudos to the conference organizers (and lucky me for picking the right sessions).
January 26th, 2010
The second day of the OOP was really exciting. First, I attended quite a few sessions and all of them proved more than worthwhile (on average, I pick one talk a day which turns out to be disappointing). Second, I gave my talk on “Renovation instead of a Wrecking Ball”, which was well attended, and I got lots of questions at the end.
My favourite talk today was “Top 10 Software Architecture Mistakes” by Eoin Woods. He was very British: funny and entertaining, but also informative. His selection of mistakes is quite arbitrary, as he admitted, but every software architect should nevertheless be aware of them. I have come across most of these mistakes in my own career, and I will remember two of his quotes: “If you think education is expensive, try ignorance” and “The first disaster recovery test is usually a disaster”. You can find a previous version of his slides here.
Next up was a vendor talk given by Sue McKinney (IBM Software Group): “Backing into Agile Leadership”. After yesterday’s vendor talk by Intel I was quite reluctant to attend this presentation. However, it was far from a sales show. Sue described how IBM is trying to introduce agile software development in IBM’s software group, with about 8000 developers. One of the top challenges is to establish new values with managers and developers, foremost trust and responsibility rather than command and control. This talk was nicely complemented by a presentation by Matthias Ziegler (Microsoft) on “How Microsoft managed the transition from a traditional waterfall model to agile development“. It’s not only agile processes but also quality awareness which plays a much larger role at Microsoft nowadays. Quality even has a higher priority than sticking to a release date or feature completeness. If both IBM and Microsoft align their development along agile processes, then the agile community really has come a long way.
In between I attended Markus Andrezak’s presentation on “Kanban for large scale Off-Shored Product Maintenance at mobile.de“. Quite impressive to see how they reduced both lead and cycle time of the maintenance tasks, even if you take into account that their previous development process was, to put it mildly, rather chaotic and suboptimal.
The day was concluded with a late-night panel discussion hosted by Nicolai Josuttis. Very entertaining how they bashed Google, the German government, and Nicolai (for not having a Twitter account). 🙂
January 25th, 2010
Michael Stal: Von den Anforderungen zur Softwarearchitektur (From Requirements to Software Architecture)
This week the OOP conference is taking place in Munich. The first day is tutorials only, and I attended Michael Stal’s full-day tutorial on “Von den Anforderungen zur Softwarearchitektur” (From Requirements to Software Architecture). When I got the handouts I was quite intimidated: around 240 slides is a lot of ground to cover, even in 6 hours. Overall, the tutorial was quite good. He not only focused on basic software architecture principles but also covered, among other things, how requirements and software architecture come together, how to deal with architecture in the software lifecycle, and how to perform an architecture review. In addition, the slides contain a number of checklists and templates which I deemed quite helpful. On the other hand, I sometimes was not sure what kind of audience the presenter had in mind. Occasionally, he went to great lengths explaining basics such as the observer pattern. Even junior architects should be familiar with design patterns (at least the usual suspects from the Gang of Four book).
Ralph de Wargny: Die Multicore Zukunft – Ein Eldorado für Software-Entwickler? (The Multicore Future – An El Dorado for Software Developers?)
At the end of the first day, Ralph de Wargny, an Intel business development manager, gave a keynote speech. Unfortunately, this was an almost 100% Intel sales presentation and not very convincing. Intel ended the gigahertz race a few years ago and has been preaching the virtues of multicore ever since. They now intend to keep Moore’s law alive by doubling the number of cores every 18 months. However, software developers are hardly capable of making use of this processing power: parallel programming has been and still is hard. Multicore makes a lot of sense in a server environment, especially when virtualizing operating systems and applications. Optimizing common desktop applications for multicore is hardly worth the effort, unless you can achieve it with tools rather than doing it manually. Ironically, he showcased Crysis, a very violent game, as the first of its kind to make full use of the four cores in the new i7 to achieve more realism. At least multicore enables you to kill more virtual opponents in less time. I understand the conference organizers’ need to provide advertisement slots to the conference sponsors, but it should not be a pure sales show.
January 25th, 2010
We are happy to announce the second RIA forum, which will take place in Darmstadt (close to Frankfurt) on the 23rd of April 2010! This time, with Canoo Engineering AG as premium sponsor, four well-known speakers will talk about the advantages and disadvantages of four different ways to create effective user interfaces (especially in business contexts).
Instead of giving details here, I recommend visiting the forum page directly: http://www.riaforum.com (in German). Please be aware that we can only admit a limited audience, so if you want to join, make sure you sign up quickly.