[Sugar-devel] List of activities on ASLO
Tony Anderson
tony_anderson at usa.net
Sat Mar 12 04:32:30 EST 2016
Hi, Sebastian
I have tried to follow the Sugar Network project, but I have never
gotten a good enough understanding of it to see its benefits.
Unfortunately, for a long time it was a shell without content. Then it
appeared to be bound to internet access, which makes it unusable in the
deployments I work with. At one time you talked about a sneakernet
approach to email, which I think is a critical need, but nothing seemed
to come of it.
I don't understand why ASLO doesn't support Chris Leonard's requirement.
The use of the addon number adds one more level of indirection, but it
should be simple for ASLO to provide a URL which returns an index of
addons and activity names. What I did was write Python code that creates
this missing list of addons, Sugar activity names, and version numbers
(the version number shown on the activity page, which is often not the
most current). The code needs some tweaking to deal with 'hidden'
versions more recent than the one on the page. There is also a need to
find the GitHub location of each activity; one drawback of the migration
to GitHub is that the activities are spread across several different
owners' repositories. According to the Python code there are 567
activities listed; however, about 100 of them are the GCompris
activities which Aleksey implemented some years back. Sadly, they no
longer work. It is simpler to install the GCompris bundle in GNOME and
use a wrapper to launch it from Sugar, which at least gives us access to
the most recent versions of the various activities.
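
Roughly, such a listing script might look like the sketch below. The
listing URL pattern and the link selector are placeholders I am guessing
at, not the actual ones my code uses:

    # Sketch of an ASLO listing script.  Assumes the site serves paged
    # listing pages and that activity page links contain /addon/<number>;
    # the LISTING pattern below is an assumption, not the real URL.
    import re
    import requests
    from bs4 import BeautifulSoup

    BASE = "http://activities.sugarlabs.org"
    LISTING = BASE + "/en-US/sugar/browse/type:1?page={page}"  # assumed

    def list_activities(pages=6):
        entries = set()
        for page in range(1, pages + 1):
            html = requests.get(LISTING.format(page=page), timeout=30).text
            soup = BeautifulSoup(html, "html.parser")
            # Each activity link carries the addon number in its URL.
            for link in soup.find_all("a", href=re.compile(r"/addon/(\d+)")):
                addon = re.search(r"/addon/(\d+)", link["href"]).group(1)
                title = link.get_text(strip=True)
                if title:
                    entries.add((int(addon), title))
        return sorted(entries)

    if __name__ == "__main__":
        for addon, title in list_activities():
            print(addon, title)
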
Tony
On 03/12/2016 05:08 PM, Sebastian Silva wrote:
> Hi Tony,
> I find we often do similar things with different approaches. This is
> always good, as I think it confirms we have similar observations from
> the field (from across the globe!).
>
> From 2011 to 2014 Alsroot and I developed the Sugar Network, with
> UI/concept design and also resource planning from Laura.
> The goal was to broaden access to / replace the ASLO Library (among
> other, more ambitious goals).
> For this, among other things, Alsroot developed an API that would
> allow querying (and feeding!) the ASLO Library... and eventually even
> replacing it.
>
> The docs for the API are here:
> http://wiki.sugarlabs.org/go/Sugar_Network/API
>
> For instance, it is possible to get the list of activities thus:
> http://node.sugarlabs.org/context?type=activity&offset=0
>
> Change the offset to page through all 470 entries (there is a hard
> limit per request on the server). I don't think this is currently
> synchronizing with ASLO, but it was designed to do so.
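>
> As a rough sketch (not verified against the live server; the parameter
> and field names 'limit', 'result', and 'total' below are guesses, see
> the API docs above), paging through the endpoint could look like this:
>
>     import requests
>
>     BASE = "http://node.sugarlabs.org/context"
>
>     def all_activities(page_size=100):
>         # 'limit', 'result' and 'total' are assumed names; adjust them
>         # to whatever the Sugar Network API docs actually specify.
>         offset = 0
>         while True:
>             reply = requests.get(BASE, params={
>                 "type": "activity",
>                 "offset": offset,
>                 "limit": page_size,
>             }, timeout=30).json()
>             entries = reply.get("result", [])
>             if not entries:
>                 break
>             for entry in entries:
>                 yield entry
>             offset += len(entries)
>             if offset >= reply.get("total", offset):
>                 break
>
>     for activity in all_activities():
>         print(activity)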
>
> We could not spark interest either with Sugar Labs or with any other
> deployment, and eventually the Ministry of Education of Peru lost
> interest in supporting the project. However, I would be very happy if
> Alsroot's obsessively polished work could be used for much more. In
> Peru it continues to be used and deployed (and is visible via the web
> frontend at http://network.sugarlabs.org/ ). Laura continues to
> moderate and administer the content provided by the children, as well
> as monitor statistics. I have stopped developing it because I see no
> direct way to deploy a better user experience (yet). However, the
> dream of a Sugar Doers Network lives on.
>
> A short intro (Spanish only) to the intended usage (with Sugar shell
> integration) is here:
> http://www.dailymotion.com/embed/video/xrapcp
> (this shows an early version, 0.2; the last one was 0.9).
>
> A revisiting of these ideas should be a priority for Sugar Labs, IMHO.
>
> Regards,
> Sebastian
>
>
>
> On 12/03/16 at 00:57, Tony Anderson wrote:
>> Hi, Chris
>>
>> I put together a process to do that some months ago. I can give you a
>> working part of it which will give you two critical items: the title
>> of the activity and the addon number where it is found.
>>
>> First run collector.py. This will access activities.sugarlabs.org and
>> download six web pages giving 100 activities each (except the last),
>> for a total of 567 activities. These will appear as aslo1, ..., aslo6.
>> Next run scraper.py. This uses BeautifulSoup to scrape the six web
>> pages, giving six collections, collection1, ..., collection6. Each
>> line gives the addon number and title of an activity. The scraper.py
>> program does not access the network. You may need to install
>> BeautifulSoup to run scraper.py.
>>
>> The original version collected more information from each activity,
>> with the goal of building a csv file that could be used to record
>> information like the size of the activity, the most recent version,
>> whether po translation is supported, whether GTK+ 3 is supported,
>> whether it is a sugar-web or Python activity, whether it uses
>> GStreamer 0.10 or 1.0, and so on.
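>>
>> A short sketch of that csv step (the column names below are just the
>> fields mentioned here, not the ones the original code used):
>>
>>     import csv
>>
>>     # Assumed column names, taken from the fields listed above.
>>     FIELDS = ["addon", "title", "latest_version", "size", "has_po",
>>               "gtk3", "activity_type", "gstreamer"]
>>
>>     def write_csv(rows, path="activities.csv"):
>>         # rows: an iterable of dicts keyed by FIELDS, as produced by
>>         # the scraping step.
>>         with open(path, "w", newline="") as f:
>>             writer = csv.DictWriter(f, fieldnames=FIELDS)
>>             writer.writeheader()
>>             writer.writerows(rows)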
>>
>> Tony
>>
>> On 03/12/2016 10:22 AM, Chris Leonard wrote:
>>> Can someone with systematic access to ASLO do a data dump for me? Any
>>> format will do (txt, csv, xls, ods, etc.)
>>>
>>> I am interested in reviewing all known activities (at least those in
>>> ASLO) for i18n/L10n and investigating further to see if we can
>>> implement i18n/L10n where it does not exist. I would also like to
>>> check on the presence of repo links. I know this was only recently
>>> requested, but I might as well check on it as I'll need it for
>>> i18n/L10n follow-up.
>>>
>>> I've been updating:
>>>
>>> https://wiki.sugarlabs.org/go/Translation_Team/Pootle_Projects/Repositories
>>>
>>>
>>> Data fields of interest:
>>>
>>> Activity name
>>> Activity number
>>> Activity version (latest)
>>> Author(s) name
>>> Author(s) number
>>> Repo link (if available)
>>>
>>> Thanks in advance for any assistance.
>>>
>>> cjl
>>
>