[ubuntu-web] Fwd: Re: Homepage Nav

Andrew Mason andrew at miniatureworldmaker.com.au
Tue Jul 8 14:01:49 BST 2008


Forgot to reply to all, sorry.


----------  Forwarded Message  ----------

Subject: Re: [ubuntu-web] Homepage Nav
Date: Tue, 8 Jul 2008
From: Andrew Mason <andrew at miniatureworldmaker.com.au>
To: "Matthew Nuzum" <matthew.nuzum at canonical.com>

On Tue, 8 Jul 2008 03:53:20 am you wrote:
> On Sat, Jul 5, 2008 at 6:48 AM,  <andrew at miniatureworldmaker.com.au> wrote:
> > Not sure about your feelings, but I generally try to avoid JavaScript
> > as a requirement and instead use it for 'enhancement'.
>
> Precisely our view for ubuntu.com as well.
>
> > So can you provide a little more information about the server-side
> > setup?
>
> We can serve semi-static XML and edit it using our content management
> system. The servers are isolated from each other and the web at large,
> so they cannot proxy content. For example, an RSS aggregator would not
> work on ubuntu.com because the servers are in a type of jail and
> cannot ping other websites.
>
> > Is there no way we can have the data unified / proxied... even read
> > only?
>
> There is a way, but it is challenging. Our security policies are very,
> very strict. Few sites make the front page of Slashdot and Digg when
> they have a security vulnerability; ours is one of them, and I've been
> there personally[1]. It's the kind of publicity I *don't* like, so
> you'll probably see this mentioned again a few times: we will always
> try to err on the side of caution.
>
> Plus, our site is prone to tremendous spikes in traffic, and by keeping
> the server side of things as lightweight as possible we ensure the
> site stays available.
>

OK, so what is the scope of what we/you can do?

From my perspective, in order for the navigation/content on each site to be
identical, we need a single source from which each of the different sites
(that you want to share the same navigation) gets its content, unless you
want to do this client side.

If you want to do this client side you get the following advantages:
1) very low load on the server
2) we can pull from anywhere the client can reach with their browser

and the following disadvantages (a rough sketch of this approach follows
below):
1) we have to use JavaScript or XSLT
2) both are browser dependent
3) we have already discussed JavaScript and its pitfalls
4) if we use XSLT, all our source data will _have_ to be XML; if it
already is, this isn't a problem of course
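
To make the client-side XSLT option concrete, here is a rough sketch of how
it could look in the browser. The URLs and element id are made up for
illustration; XSLTProcessor is the Mozilla/WebKit API, and IE would need
the equivalent MSXML calls instead, which is exactly the browser dependence
mentioned above (the same-origin policy also limits which hosts the client
can really pull from):

    // Fetch the shared navigation data and stylesheet (hypothetical URLs)
    // and render the transformed result into the page.
    function loadXml(url, callback) {
        var xhr = new XMLHttpRequest();
        xhr.open('GET', url, true);
        xhr.onreadystatechange = function () {
            if (xhr.readyState === 4 && xhr.status === 200) {
                callback(xhr.responseXML);
            }
        };
        xhr.send(null);
    }

    loadXml('http://nav.example.com/nav.xml', function (navDoc) {
        loadXml('http://nav.example.com/nav.xsl', function (xslDoc) {
            var processor = new XSLTProcessor();  // Mozilla/WebKit only
            processor.importStylesheet(xslDoc);
            var fragment = processor.transformToFragment(navDoc, document);
            document.getElementById('navigation').appendChild(fragment);
        });
    });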


If you can get this server side, no matter how many hoops you need to jump
through, it will probably be worth it. If the content can be pulled or
pushed to each site, then we no longer really have a problem (a sketch of a
simple pull follows below).

I understand that there are significant security considerations, but there
_should_ be a way of getting unified content to each site. It just depends
on how hard this actually is.
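
Just to illustrate what I mean by "pulled", here is a minimal sketch
(Node-style JavaScript purely for illustration; a cron job in any language
would do, and the URL and output path are made up):

    // Periodically pull the shared navigation fragment from the single
    // trusted source and write it where the local site can include it.
    var http = require('http');
    var fs = require('fs');

    function pullNav() {
        http.get('http://nav.example.com/nav.xml', function (res) {
            var body = '';
            res.on('data', function (chunk) { body += chunk; });
            res.on('end', function () {
                if (res.statusCode === 200) {
                    fs.writeFileSync('/var/www/includes/nav.xml', body);
                }
            });
        });
    }

    pullNav();                              // pull once at startup
    setInterval(pullNav, 15 * 60 * 1000);   // then refresh every 15 minutes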

I am more than happy to do an XML/XSLT prototype for you, but I think deep
down you would probably prefer to have the content formed before sending it
out to the clients too :)

Could we have the content signed with a public/private key pair and verify
the signature on each site, so that the data is guaranteed to have come
from a known good location (a sketch follows below)? What sort of security
hoops are we talking about? Is there a security team member who can join
the discussion with us, so that we know what their concerns are and what
they would prefer us to do?
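
For example (a minimal sketch, again in Node-style JavaScript for
illustration; the key and file locations are made up), each site could
verify a detached signature over the nav file before installing it:

    // Verify that the navigation content really came from the known good
    // source before using it. Assumes the source publishes nav.xml plus a
    // detached RSA signature, and each site holds the source's public key.
    var crypto = require('crypto');
    var fs = require('fs');

    var publicKey = fs.readFileSync('/etc/nav/source-public.pem', 'utf8');
    var content   = fs.readFileSync('/tmp/nav.xml');
    var signature = fs.readFileSync('/tmp/nav.xml.sig');

    var verifier = crypto.createVerify('RSA-SHA256');
    verifier.update(content);
    if (verifier.verify(publicKey, signature)) {
        fs.writeFileSync('/var/www/includes/nav.xml', content);
    } else {
        console.error('nav.xml signature check failed; keeping old copy');
    }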

I'm sure that if we explain the problem, they will have some ideas we can
work with.

Andrew

> [1] https://wiki.ubuntu.com/UbuntuWeeklyNewsletter/Issue52#head-b009291e4151391137b8f04a53adea995d0ee280



-------------------------------------------------------


