Incomplete installation of subversion

Andrey Repin
Fri Aug 13 18:00:00 GMT 2010

Greetings, Phil Betts!

>>>> >> <stdout>:curl -iI -H "Accept-Encoding: gzip" -s -- ""
>>>> >> HTTP/1.1 200 OK
>>>> >> Date: Thu, 12 Aug 2010 06:59:40 GMT
>>>> >> Server: Apache/2.0.52 (Red Hat)
>>>> >> Last-Modified: Tue, 10 Aug 2010 16:28:21 GMT
>>>> >> ETag: "18e01b8-a7413-9f101340"
>>>> >> Accept-Ranges: bytes
>>>> >> Vary: Accept-Encoding
>>>> >> Content-Encoding: gzip
>>>> >> Cache-Control: max-age=0
>>>> >> Expires: Thu, 12 Aug 2010 06:59:40 GMT
>>>> >> Content-Type: application/octet-stream
>>>> > Works for me with wget:
>>>> Of course. It's just that you can't launch it after wget - the file
>>>> doesn't have execute permissions.
>>> chmod +x ?
>> Indeed, but yet again, that's not the point of my question.
>> I have a download manager processing downloads from my web browser.
>> It's quite enough for me - when the server behaves correctly.

> There's nothing wrong (in this regard) with the server.  See

> | In HTTP, it SHOULD be sent whenever the message's length can be
> | determined prior to being transferred

RFC 2119 is my most loved document. :)
I know the meaning of these words, and as my experience over fifteen years
suggests, a proper Content-Length header for downloadable content is the norm
across the internet. Your site is only the third in all those years that
doesn't get this right. And that holds regardless of its size or the work
someone else has put into it. I do respect it highly, trust me; otherwise I
wouldn't bother reporting issues with it.

> You forced it to use gzip encoding,

I *suggested*, not forced. And since the server accepted the suggestion, I
expect it to send proper headers in the response.
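To make the negotiation concrete: the client merely advertises gzip in Accept-Encoding, and the server may opt in; once it does, it owns the response headers. A minimal sketch of that server-side decision, assuming a hypothetical helper (real servers such as Apache's mod_deflate implement the same negotiation, only more carefully):

```python
import gzip

def maybe_compress(body: bytes, accept_encoding: str) -> tuple[bytes, dict]:
    """Compress the body only if the client suggested gzip.

    Hypothetical illustration, not any particular server's code.
    """
    headers = {"Content-Type": "application/octet-stream"}
    # Accept-Encoding is a comma-separated list, possibly with q-values.
    tokens = [t.split(";")[0].strip().lower()
              for t in accept_encoding.split(",")]
    if "gzip" in tokens:
        body = gzip.compress(body)
        headers["Content-Encoding"] = "gzip"
    # The whole output is buffered here, so the final length is known
    # and Content-Length can (and SHOULD) be sent.
    headers["Content-Length"] = str(len(body))
    return body, headers
```

Note that the helper sets Content-Length in both branches: accepting the gzip suggestion does not excuse the server from declaring the length once it has the full compressed body in hand.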

> which is often a streaming process, and in general a server won't know in
> advance how long the content will be.

Not for downloadable content, again. An HTML page transfer is often a stream,
yes (not all server-side processors are able to compress their output, and not
all site owners choose to enable that ability even when it is present, due to
the considerable increase in CPU usage).
For downloadable content, the size is known beforehand in most cases.
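The distinction can be shown directly: if the server compresses the whole file before sending (or serves a pre-gzipped file from disk), the final length is known up front; only true on-the-fly streaming leaves it undetermined until the last flush. An illustrative sketch with Python's gzip module (the payload is a stand-in, not the actual download):

```python
import gzip

# Stand-in for a downloadable file of known size.
original = b"payload " * 10_000

# Buffered compression: the server compresses everything first,
# so Content-Length is simply the size of the compressed blob.
compressed = gzip.compress(original)
content_length = len(compressed)

# By contrast, a streaming compressor emits chunks incrementally and
# only knows the total after the final flush - which is why streaming
# servers fall back to chunked transfer encoding instead of a length.
```

In other words, omitting Content-Length is a consequence of choosing to stream the compression, not an inherent property of gzip itself.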

> Remove the -H (and -I) and curl works just fine (and the content is
> shorter than the gzipped version).

I know; that's how I discovered the issue.

> In fact, curl works just fine even with the -H option, as long as you
> remove the -I, and remember to gunzip the contents.

curl, yes, I know that... Ah well, let's wrap up this discussion; it seems to
be leading nowhere.

> You had the choice of:
> a) criticizing the set-up of one of the web's largest and most reliable
>    download sites or
> b) pausing to consider whether perhaps you'd missed something in your
>    HTTP class
> I think perhaps you made the wrong choice.

 Andrey Repin, 13.08.2010, 21:40

Sorry for my terrible English...


More information about the Cygwin mailing list