Replacing a repository

They say you should blog about something you like, stick to the subject, be informal, and write something useful. Add subjective knowledge to the community.

My thoughts on the field of content management haven't really evolved much since I started working a month ago, but I have encountered a portal/CMS system which I will sooner or later have to use and develop with, I guess. Now, the sorry thing about this portal is that I can't apply my experience with the Java Content Repository (JCR) standard, JSR-170. The portal vendor *could* implement support for it, but given what I suspect the internal data structure looks like, this would mean replacing the existing repository entirely, throwing years of development and use out the window.

My gut feeling tells me this would not be worth it at this point. But for the sake of argument, let's weigh the implications of replacing a homegrown repository with, for instance, Jackrabbit.

Negative implications

Implementation. The data structure needs to be rewritten. The persistence mechanisms might not be loosely coupled enough, or interfaced in the same way as the JCR, so the old repository can't simply be unplugged and replaced with Jackrabbit.
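
To make "loosely coupled enough" concrete: if the portal talked to its storage through a narrow facade like the sketch below, a JCR-backed implementation could be swapped in behind it. All the names here are invented for illustration, since I don't know the portal's actual internals.

```java
// ContentStore.java -- a hypothetical facade the portal would program
// against, so the backing store could be swapped (homegrown today,
// Jackrabbit tomorrow) without touching portal code.
public interface ContentStore {
    String readProperty(String path, String name) throws ContentStoreException;
    void writeProperty(String path, String name, String value) throws ContentStoreException;
}

// ContentStoreException.java
public class ContentStoreException extends Exception {
    public ContentStoreException(Throwable cause) { super(cause); }
}

// JcrContentStore.java -- a JCR-backed implementation of the same contract.
public class JcrContentStore implements ContentStore {
    private final javax.jcr.Session session;

    public JcrContentStore(javax.jcr.Session session) {
        this.session = session;
    }

    // Paths are relative to the root node, e.g. "pages/welcome".
    public String readProperty(String path, String name) throws ContentStoreException {
        try {
            return session.getRootNode().getNode(path).getProperty(name).getString();
        } catch (javax.jcr.RepositoryException e) {
            throw new ContentStoreException(e);
        }
    }

    public void writeProperty(String path, String name, String value) throws ContentStoreException {
        try {
            session.getRootNode().getNode(path).setProperty(name, value);
            session.save();
        } catch (javax.jcr.RepositoryException e) {
            throw new ContentStoreException(e);
        }
    }
}
```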

Existing content. There are probably thousands of pages of existing content using the old data structure. This is an implication you really get to feel when migrating from one CMS to another, but it also applies when changing the content repository alone. Imagine all the customers that would have to migrate their content when they wish to upgrade to the latest version of the portal... Ugh.
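
Just to make the scale of the problem concrete: the mechanics of such a migration would boil down to walking the old structure and recreating it as JCR nodes, something like the sketch below. The LegacyPage type is invented, since I don't know the portal's actual data structure.

```java
import javax.jcr.Node;
import javax.jcr.Session;

// A one-shot migration walk: read every page out of the old structure
// and recreate it as JCR nodes. LegacyPage is an invented stand-in for
// whatever the portal's homegrown repository actually exposes.
public class LegacyMigration {

    public interface LegacyPage {
        String getName();
        String getTitle();
        String getBody();
    }

    public void migrate(Iterable<LegacyPage> legacyPages, Session session) throws Exception {
        Node target = session.getRootNode().addNode("migrated");
        for (LegacyPage page : legacyPages) {
            Node node = target.addNode(page.getName());
            node.setProperty("title", page.getTitle());
            node.setProperty("body", page.getBody());
        }
        session.save(); // persist the whole batch in one go
    }
}
```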

Positive implications

Developer-friendly. I know JSR-170, so I already know how to use the repository. I don't need to spend X weeks learning the repository's domain model and its API.
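
For a taste of what that means, here's a minimal JSR-170 round trip against Jackrabbit's TransientRepository (the credentials and node names are just made up for the example):

```java
import javax.jcr.Node;
import javax.jcr.Repository;
import javax.jcr.Session;
import javax.jcr.SimpleCredentials;
import org.apache.jackrabbit.core.TransientRepository;

// Log in, write a node, save, and read the property back.
public class JcrHelloWorld {
    public static void main(String[] args) throws Exception {
        Repository repository = new TransientRepository();
        Session session = repository.login(
                new SimpleCredentials("admin", "admin".toCharArray()));
        try {
            Node root = session.getRootNode();
            Node page = root.addNode("pages").addNode("welcome");
            page.setProperty("title", "Hello, JCR");
            session.save();
            System.out.println(page.getProperty("title").getString());
        } finally {
            session.logout();
        }
    }
}
```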

It's a standard. The repository becomes compliant with the standard, and content can be imported from and exported to other JCR implementations.
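
The spec bakes this portability right into the Session API. A sketch of round-tripping a subtree through the system view XML format (the paths and file names are just examples):

```java
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import javax.jcr.ImportUUIDBehavior;
import javax.jcr.Session;

// Export a subtree as system view XML from one repository,
// then import it under a parent node in another.
public class JcrPortability {

    public static void exportSubtree(Session session, String absPath, String file)
            throws Exception {
        OutputStream out = new FileOutputStream(file);
        try {
            // skipBinary=false, noRecurse=false: take the whole subtree along
            session.exportSystemView(absPath, out, false, false);
        } finally {
            out.close();
        }
    }

    public static void importSubtree(Session session, String parentAbsPath, String file)
            throws Exception {
        InputStream in = new FileInputStream(file);
        try {
            session.importXML(parentAbsPath, in,
                    ImportUUIDBehavior.IMPORT_UUID_CREATE_NEW);
            session.save();
        } finally {
            in.close();
        }
    }
}
```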

Free stuff. Jackrabbit has implemented a lot of the nice features the spec provides: versioning, security, search, transaction management, and access control, to name a few.
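
Versioning alone is a good example of what you get for free. A sketch of the JSR-170 versioning calls (the node path is made up):

```java
import javax.jcr.Node;
import javax.jcr.Session;
import javax.jcr.version.Version;

// JSR-170 versioning in a nutshell: make a node versionable,
// then check in/check out around each change.
public class JcrVersioning {

    public static void demo(Session session) throws Exception {
        Node page = session.getRootNode().getNode("pages/welcome");
        page.addMixin("mix:versionable");
        session.save();

        Version v1 = page.checkin();   // freeze the current state as a version
        page.checkout();               // make the node writable again
        page.setProperty("title", "Updated title");
        session.save();
        page.checkin();

        System.out.println("First version: " + v1.getName());
    }
}
```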

Any more implications? To me, the migration one stands out as the biggest blocker. And as long as the customers don't have a need for features like these, JCR compliance remains a (rather weak) selling point.

Haven't got any more time now; I will try to write back with the "conclusion" later on. In a couple o' weeks I will actually get to meet the lead guy behind this portal's development, and I have until then to come up with the arguments to convince him. But first I have to convince myself :)
