After the failed upgrade attempt a fortnight ago I spent a lot of time looking in the wrong direction. The main problem was that users were unable to log in, so I thought it must have been session related. I tried many experiments with cookies, cache and sessions locally but couldn't work out what was going wrong. Everything seemed to be working correctly apart from the failed logins.
z-index changes the stacking order, or layering, of positioned elements. If no element is positioned, each successive descendant is stacked higher, closer to the front.
Each positioned element that has been assigned a z-index other than auto creates a stacking context, which may be easier to explain with an analogy: let's use a sandpit. The sandpit is positioned absolutely, so the children don't move it and make a bigger mess, and has a z-index of 1.
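A minimal sketch of the sandpit analogy (the class names are my own invention):

```css
/* The sandpit: positioned absolutely with a z-index of 1,
   so it creates a new stacking context for its children. */
.sandpit {
  position: absolute;
  z-index: 1;
}

/* Children stack relative to the sandpit, not the page.
   However large a child's z-index, it cannot escape the
   sandpit's own stacking context. */
.sandpit .toy {
  position: absolute;
  z-index: 9999; /* still layered within the sandpit */
}
```

An element outside the sandpit with `z-index: 2` will sit in front of every toy, because the toys are compared with it only as part of the sandpit's single layer.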
CSS gives web designers the power to style many pages simply and to change the look of a whole site in one place. If you look at a site as a whole, you can simplify your style sheets and make them smaller and easier to maintain.
Let's look at the basic structure of a rule to make the rest of the explanation easier to understand.
There are different types of selectors, which I am not going into here, except to say that you can have multiple selectors separated by commas.
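As a quick sketch, a rule is a selector (or a comma-separated list of selectors) followed by a declaration block of property: value pairs:

```css
/* selector { property: value; } */
h1 {
  color: #333;
  margin-bottom: 1em;
}

/* Multiple selectors, separated by commas,
   sharing one declaration block */
h1, h2, h3 {
  font-family: Georgia, serif;
}
```

Grouping selectors like this is one of the easiest ways to keep a style sheet small: one rule styles every heading, and the look of the whole site changes in one place.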
I wonder and muse on whether, in reality, it was wise to give browsers the ability to render markup using the 'Tag Soup' rendering engine, or at least such a good ability.
Of course we know that without it more than half the web would vanish overnight, but I am constantly staggered by quite how effective it is at rendering code that is so completely wrong, code that you just stare at and honestly question how or why it has any right to be parsed.
I further muse on whether it might somehow have been better to have a mechanism by which, if one wanted the power, flexibility and benefits derived from Standards and CSS, these were available only through the strict use of Strict DTDs, and in using them the Tag Soup engine would be far less lenient with code parsing and rendering, clearly marking a line between good code and bad. I know that true XHTML provides for this, and also that the XML parser has been criticized for being far too strict and fault-intolerant, but I can only see that as a good thing: it says you must write proper code or else we will return a list of your errors, which is not that great a hardship or an impossible goal to reach.
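For reference, opting into the stricter world described above starts with a Strict doctype; these are the actual HTML 4.01 and XHTML 1.0 Strict declarations:

```html
<!-- HTML 4.01 Strict -->
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
  "http://www.w3.org/TR/html4/strict.dtd">

<!-- XHTML 1.0 Strict; served as application/xhtml+xml it is
     handled by the unforgiving XML parser rather than Tag Soup -->
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
```

In practice, of course, browsers still run the Strict doctypes through the lenient HTML parser unless the XML content type is used, which is exactly the leniency the post is questioning.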
Santa is bringing us a shiny new server.
Yes, it was many months ago I first promised to get a new server, and now it is finally going to happen. Thanks for being patient.
By the second week of January we should have finished the move to the new server.
Hopefully the downtime will be minimal.
Have a Merry Christmas, Happy Holidays and look forward to a faster site in the new year.