[sword-devel] InstallMgr
Greg Hellings
greg.hellings at gmail.com
Thu May 14 21:11:52 MST 2009
On Thu, May 14, 2009 at 9:53 PM, Troy A. Griffitts <scribe at crosswire.org> wrote:
> Thanks, Matthew. Yes, Daniel, we have had an established standard for
> installing modules for quite some time which uses FTP for remote module
> installation. I would like all of our client applications to support all of
> our officially supported install methods.
>
> The reason for this is so we can assure a content provider that their
> content will be available to 'SWORD applications' if they:
>
> x y z
>
> Currently, we can say this, but we have to add a disclaimer: "JSword-based
> apps will not work unless you also zip up all of your modules and make these
> zips available as HTTP downloads."
>
> I wish for this disclaimer to shrink, not grow, so I am trying to herd in
> the development efforts and ask all frontend developers to ensure that their
> applications at least support the already longstanding installation policies
> in place.
>
>
> Now concerning new methods, like HTTP access. Matthew is correct that we
> hope to add support for this in the 1.6.x branch of the engine. At that
> time, we will ask all frontend developers to expose this functionality as
> well so we can then give content creators one more option (CD, FTP, ... +
> HTTP) to distribute their content.
>
>
> The details of how this will be implemented are being discussed, but I think
> we are unanimous that we SHOULD implement HTTP access.
>
> I would like for this to be implemented in a way which doesn't depend on scripts
> for minimum HTTP support for a repository. Following the same theme as
> other methods: "Just expose your working SWORD module library via HTTP."
>
> This is difficult to implement for us, but we can do it once in C++ and have
> all frontends get the new functionality, and then all content providers have
> an easy means to expose content.
>
> On top of that, we may add what I've been labeling as CACHING / OPTIMIZATION
> mechanisms to make downloading quicker, but these will require the content
> providers to be more savvy, possibly implementing cron jobs to zip up folders,
> etc. But these mechanisms will be optional for the content providers. They
> will not need to do these things to have a valid repository.
>
> Imagine using your favourite SWORD application installer to install a
> library of all the materials your ministry uses to reach your people group.
> Everything is just how you want it on your computer.
> Now you can copy it to a CD as-is, and it is a valid installation source to
> any other SWORD frontend. You can pass that CD to a church with a website
> and they can copy it as-is to their website and serve the entire library
> with SWORDweb. They can point an FTP server to it as-is, and it becomes a
> valid remote installation source for their congregation or other people
> working in the same domain and running any SWORD client,
> ...
> and in the future, they hopefully will be able to allow HTTP access to the
> same location as-is and it becomes a valid remote installation source.
>
> The zip scripts and ls-lR files are all fine to help improve speed and
> reduce processor load, but I am willing to bite the bullet and do the work
> to make things functional without them, if it keeps this easy-to-propagate
> scenario available.
>
> Am I helping explain my reasoning?
While I'm not sure that adding an ls-lR file would improve processing
speed any more than fetching the current directory listing does, I
have a few additional questions about "other methodologies" support.
Is it any different (for the C++ front ends) to add support for FTP
InstallMgr methods than it is for them to add a file:// version? If
so, that seems like a problem with the design of InstallMgr. I would
imagine that the interface between SWORD and the application would be
identical in all of those cases -- all the application needs to do is
retrieve the information and provide a way for the user to input new
locations. I get the impression from your previous messages that the
front ends will need to change when you add HTTP support to the
library: that they will need to add another portion to their install
manager that allows the user to create HTTP install locations
separately from FTP or file:// ones. Is that the case?
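
For what it's worth, here is a rough sketch of what I would expect the
application-facing side to look like if the library handled every
transport uniformly. To be clear, these class and method names are
hypothetical -- this is not the actual InstallMgr API; the point is
only that the front end never has to branch on the scheme:

    // Hypothetical sketch -- not the real SWORD InstallMgr API.
    #include <iostream>
    #include <string>

    class RemoteSource {
    public:
        explicit RemoteSource(const std::string &location)
            : location_(location) {}

        // The library would pick the transport (file, FTP, HTTP) from
        // the scheme internally; the calling code is identical either way.
        void refreshModuleList() const {
            std::cout << "refreshing module list from " << location_ << "\n";
        }
        void installModule(const std::string &name) const {
            std::cout << "installing " << name << " from " << location_ << "\n";
        }

    private:
        std::string location_;
    };

    int main() {
        // The only per-source difference is the location string the
        // user typed in; the front end treats all three identically.
        RemoteSource ftp("ftp://ftp.crosswire.org/pub/sword/raw");
        RemoteSource file("file:///media/cdrom/sword");
        RemoteSource http("http://example.org/sword"); // hypothetical URL
        ftp.refreshModuleList();
        file.refreshModuleList();
        http.refreshModuleList();
        ftp.installModule("KJV");
    }
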
Secondly - while asking people to add a simple cron job that does
"ls -lR > /var/www/htdocs/ls-lR" or whatever hardly seems more
difficult than asking them to actually install and configure a web
server, if you want to do it completely without that, doesn't cURL
already support communication over HTTP? Can't the current FTP
transport class that uses cURL (obviously not the one that uses
ftplib) just ask cURL whether the source was HTTP or FTP and process
it accordingly? All that then needs to be done is to parse the HTML
file that comes back from the server and grab only the links that are
not part of a special HTTP control string (those that begin with "?")
and that do not lead to a higher directory. A good HTML parsing
library should be able to hand you full-path HREF values, which you
can then compare against the current directory. If the HREF is
shorter than the current directory, it is the "parent directory"
link; if the file portion begins with or contains a "?", then it is
one of the control links for the directory - such as the "sort by"
links at the top of the directory listing. Obviously, I'm going by
the standard Apache directory listing page. I don't have a handy
install of IIS to check its directory listing page, but almost all
such pages I've seen are very close to uniform when it comes to the
actual links they contain.
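
To make that concrete, here is a sketch of both halves: cURL fetching
a listing without caring which scheme the URL uses, and the
link-filtering heuristic described above. The URLs and sample HREF
values are made up, and the filtering only reflects the Apache-style
listing; I haven't tested it against other servers:

    #include <curl/curl.h>
    #include <iostream>
    #include <string>
    #include <vector>

    // libcurl write callback: accumulate the response body in a string.
    static size_t writeBody(char *data, size_t size, size_t nmemb, void *userp) {
        static_cast<std::string *>(userp)->append(data, size * nmemb);
        return size * nmemb;
    }

    // One fetch routine for both schemes: cURL dispatches on the URL's
    // scheme, so ftp:// and http:// go through identical code here.
    static std::string fetchListing(const std::string &url) {
        std::string body;
        CURL *curl = curl_easy_init();
        if (!curl) return body;
        curl_easy_setopt(curl, CURLOPT_URL, url.c_str());
        curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, writeBody);
        curl_easy_setopt(curl, CURLOPT_WRITEDATA, &body);
        curl_easy_perform(curl);
        curl_easy_cleanup(curl);
        return body;
    }

    // Given full-path HREF values pulled out of the listing page by an
    // HTML parser, keep only real entries: drop the "?" control links
    // and anything at or above the current directory.
    static std::vector<std::string> filterEntries(
            const std::vector<std::string> &hrefs,
            const std::string &currentDir) {
        std::vector<std::string> entries;
        for (const std::string &href : hrefs) {
            if (href.find('?') != std::string::npos) continue; // sort-by links
            if (href.size() <= currentDir.size()) continue;    // parent dir
            entries.push_back(href);
        }
        return entries;
    }

    int main(int argc, char **argv) {
        // Made-up HREFs in the shape of an Apache listing page.
        std::vector<std::string> hrefs = {
            "/sword/", "/sword/mods.d/", "/sword/modules/", "/sword/?C=N;O=D"
        };
        for (const std::string &e : filterEntries(hrefs, "/sword/"))
            std::cout << e << "\n";
        if (argc > 1) // e.g. http://example.org/sword/ or ftp://...
            std::cout << fetchListing(argv[1]);
        return 0;
    }
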
Is this a basic idea of how it could/should work? Or do we want it
to internally handle the case where cURL isn't available and we've
had to fall back to ftplib? I think you'll run into problems there,
and for any such system it would be up to the application developer
to extend the necessary classes to implement the HTTP support at
their own level. The only technical issues I can see are which HTML
parsing library to use and also (depending on the answer to my first
question) why the application would have to treat file, ftp, or http
sources differently if the library actually handles all such
installations.
--Greg
>
> -Troy.