[Sugar-devel] List of activities on ASLO
Sebastian Silva
sebastian at fuentelibre.org
Mon Mar 14 11:20:00 EDT 2016
Guys, one thing that is missing here is the bundle_id (e.g. org.laptop.Terminal).
This is supposed to be the unique identifier of each activity. Without it,
matching entries and pruning duplicates will be difficult and error-prone.
Regards,
Sebastian
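To illustrate the point, a minimal sketch of pruning duplicates keyed on bundle_id (the record layout and the sample entries below are invented for the example; only org.laptop.Terminal comes from this thread):

```python
# Sketch: deduplicate an activity list by bundle_id, keeping the
# highest-version entry for each id. Records here are hypothetical.
activities = [
    {"bundle_id": "org.laptop.Terminal", "title": "Terminal", "version": 45},
    {"bundle_id": "org.laptop.Terminal", "title": "Terminal", "version": 46},
    {"bundle_id": "org.laptop.AbiWordActivity", "title": "Write", "version": 98},
]

def prune_duplicates(activities):
    """Return one record per bundle_id, preferring the newest version."""
    best = {}
    for act in activities:
        bid = act["bundle_id"]
        if bid not in best or act["version"] > best[bid]["version"]:
            best[bid] = act
    return list(best.values())

unique = prune_duplicates(activities)
```

Matching on the title alone would conflate translated or forked uploads; the bundle_id is the only field meant to be stable across versions.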
On 14/03/16 at 10:18, Chris Leonard wrote:
> This time with attachment.
>
> On Mon, Mar 14, 2016 at 11:18 AM, Chris Leonard
> <cjlhomeaddress at gmail.com> wrote:
>> Thanks for the scripts, Tony. I filled in the other fields with grep of
>> the aslo# files. I think scraper.py does some sorting that causes the
>> collection list and the grep scrapes to misalign (in part because of
>> case-sensitive sorting). It was laborious; it would be nice to have an
>> improved version of this data-collection tool for periodic monitoring
>> of activity status (PO filename, or latest version number and update
>> date).
>>
>> I thought others might be interested in the results (so far).
>>
>> What I would love to add as columns on this spreadsheet are:
>>
>> ported to gtk3?
>>
>> set up for i18n?
>>
>> repo location?
>>
>> POT in Pootle?
>>
>> In going through this sheet, there are some apparent duplicate
>> activities (possibly added for single-language support).
>>
>> cjl
>>
>>
>> On Sat, Mar 12, 2016 at 12:57 AM, Tony Anderson <tony at olenepal.org> wrote:
>>> Hi, Chris
>>>
>>> I put together a process to do that some months ago. I can give you the
>>> working part of it, which will give you two critical items: the title of
>>> each activity and the addon page where it is found.
>>>
>>> First run collector.py. This accesses activities.sugarlabs.org and
>>> downloads six web pages listing 100 activities each (except the last),
>>> for a total of 567 activities. These appear as aslo1, ..., aslo6. Next
>>> run scraper.py. This uses BeautifulSoup to scrape the six web pages,
>>> producing six collections, collection1, ..., collection6. Each line
>>> gives the addon and title of an activity. scraper.py does not access
>>> the network, but you may need to install BeautifulSoup to run it.
>>>
>>> The original collected more information from each activity, with the
>>> goal of building a CSV file that could record information such as the
>>> size of the activity, the most recent version, whether po (translation)
>>> is supported, whether GTK+ 3 is supported, whether it is a sugar-web or
>>> Python activity, whether it uses GStreamer 0.10 or 1.0, and so on.
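The scraping step Tony describes can be sketched as follows. His scraper.py uses BeautifulSoup; to keep this example self-contained it uses the stdlib html.parser instead, and the ASLO markup below (div class, link structure, activity numbers) is entirely invented — the real page layout differs:

```python
from html.parser import HTMLParser

# Hypothetical listing markup standing in for a downloaded asloN page.
PAGE = """
<div class="addon"><a href="/activity/4043">Terminal</a></div>
<div class="addon"><a href="/activity/4044">Write</a></div>
"""

class AddonScraper(HTMLParser):
    """Collect (addon_url, title) pairs from anchors inside addon divs."""

    def __init__(self):
        super().__init__()
        self.in_addon = False   # inside a <div class="addon">?
        self.href = None        # href of the anchor we are reading
        self.results = []       # accumulated (addon, title) pairs

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "div" and attrs.get("class") == "addon":
            self.in_addon = True
        elif tag == "a" and self.in_addon:
            self.href = attrs.get("href")

    def handle_data(self, data):
        # Text immediately after an addon anchor is its title.
        if self.href and data.strip():
            self.results.append((self.href, data.strip()))
            self.href = None

    def handle_endtag(self, tag):
        if tag == "div":
            self.in_addon = False

scraper = AddonScraper()
scraper.feed(PAGE)
```

Splitting collection (network fetch into aslo1..aslo6) from scraping (offline parse into collection1..collection6), as Tony does, means the parser can be rerun and debugged without hammering activities.sugarlabs.org.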
>>>
>>> Tony
>>>
>>>
>>> On 03/12/2016 10:22 AM, Chris Leonard wrote:
>>>> Can someone with systematic access to ASLO do a data dump for me? Any
>>>> format will do (txt, csv, xls, ods, etc.)
>>>>
>>>> I am interested in reviewing all known activities (at least those in
>>>> ASLO) for i18n/L10n and investigating further to see if we can
>>>> implement i18n/L10n where it does not exist. I would also like to
>>>> check on presence of repo links. I know this was only recently
>>>> requested, but I might as well check on it as I'll need it for
>>>> i18n/L10n follow up.
>>>>
>>>> I've been updating:
>>>>
>>>>
>>>> https://wiki.sugarlabs.org/go/Translation_Team/Pootle_Projects/Repositories
>>>>
>>>> data fields of interest
>>>>
>>>> Activity name
>>>> Activity number
>>>> Activity version (latest)
>>>> Author(s) name
>>>> Author(s) number
>>>> Repo link (if available)
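The fields Chris lists map naturally onto a CSV, which is also the format Tony's original tool aimed at. A minimal sketch (column names and the sample row are invented; only the field meanings come from the list above):

```python
import csv
import io

# Column names mirror the wish list above; the row data is hypothetical.
FIELDS = ["activity_name", "activity_number", "latest_version",
          "authors", "author_count", "repo_link"]

rows = [
    {"activity_name": "Terminal", "activity_number": 4043,
     "latest_version": "45.1", "authors": "Sugar Labs",
     "author_count": 1, "repo_link": ""},  # repo link often missing
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(rows)
csv_text = buf.getvalue()
```

Any of the requested formats (txt, csv, xls, ods) could carry the same columns; CSV has the advantage of being diffable for the periodic monitoring mentioned earlier in the thread.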
>>>>
>>>> Thanks in advance for any assistance.
>>>>
>>>> cjl
>>>> _______________________________________________
>>>> Sugar-devel mailing list
>>>> Sugar-devel at lists.sugarlabs.org
>>>> http://lists.sugarlabs.org/listinfo/sugar-devel
>>>