[OOTB-hive] WG: [ADDONS] Proposed activity details

Martin Cosgrave martin at ocretail.com
Wed Aug 20 21:57:35 BST 2014


On 20/08/14 00:48, Axel Faust wrote:
>
> *_Activity #1: Form list of Order-reviewed and -accepted Alfresco 
> add-ons_*
>
> · Non-commercial / open-source add-ons only (anyone – ourselves 
> included – should be able to use it / modify it)
>
essential

> · Add-ons listed on addons.alfresco.com only (this is and should remain 
> the central directory)
>
+1

> · Definition of detailed eligibility / acceptance criteria
>
> o Eligibility: Do we only include add-ons in the narrow sense of 
> extension of Repository and Share, or tools / scripts for 
> administration and/or development as well?
>
The wider sense, otherwise how do we include the Loftux install script? ;-)

> · Successful compatibility test on most recent / most stable Alfresco 
> CE version(s) (currently Alfresco 4.2f + Alfresco 5.0a - may be a 
> single version in the future)
>
I was thinking about this. While it would be lovely to track the latest 
and greatest, I feel strongly that we should not, and should instead track 
the most stable release, currently 4.2.f. We simply do not have, and 
probably never will have, the resources to track bugs on the new releases 
the way Alfresco can.

> · Successful compatibility test with other add-ons already reviewed / 
> accepted
>
> · Successful compatibility test with current supported client environment
>
Hopefully we can have a one-time setup where the add-on submitter 
verifies their add-on against our test systems and gives us a test suite 
which passes in that environment. To be included they would need to sign 
up with us, I think, and if they stopped responding to our technical 
questions (possibly a couple of years in the future) they would get 
removed. We should track the bugs and assign the fixes to them (if we 
can; it depends on how important we are, or they are!) and expect tests 
demonstrating the fix to be added to the suite whenever a bugfix is 
applied.
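
A rough sketch of the kind of regression test I would expect to arrive 
with a bugfix (Python, using the requests library). The web script URL, 
the credentials and the response field are all made-up placeholders, not 
any real add-on's API:

import requests

ALFRESCO_BASE = "http://localhost:8080/alfresco"  # assumed test environment
ADMIN_AUTH = ("admin", "admin")                   # assumed default credentials

def test_addon_webscript_responds():
    """Smoke/regression test: the add-on's (hypothetical) web script answers
    and still returns the field the bugfix was supposed to restore."""
    resp = requests.get(
        ALFRESCO_BASE + "/service/example-addon/status",  # hypothetical URL
        auth=ADMIN_AUTH,
        timeout=30,
    )
    assert resp.status_code == 200
    assert resp.json().get("installed") is True  # hypothetical response field

Run with pytest against the reference environment; if the add-on 
regresses, the suite should tell us before any manual review does.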

> · Basic functionality test from user perspective (using available 
> documentation)
>
Selenium scripts for functionality tests would be something to aspire to.
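
To make that concrete, here is the sort of minimal script I am thinking 
of (Python bindings, WebDriver). The Share login selectors and the 
add-on's dashlet class are placeholders I have invented for the example, 
so treat it as a sketch rather than a working test:

from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

SHARE_URL = "http://localhost:8080/share"  # assumed test environment

driver = webdriver.Firefox()
try:
    driver.get(SHARE_URL + "/page/")
    # Log in (the id suffixes below are hypothetical placeholders)
    driver.find_element(By.CSS_SELECTOR, "input[id$='username']").send_keys("admin")
    driver.find_element(By.CSS_SELECTOR, "input[id$='password']").send_keys("admin")
    driver.find_element(By.CSS_SELECTOR, "button[id$='submit']").click()
    # Basic functionality check: the add-on's dashlet shows up on the dashboard
    WebDriverWait(driver, 15).until(
        EC.presence_of_element_located((By.CSS_SELECTOR, ".example-addon-dashlet"))
    )
finally:
    driver.quit()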

> · General technical review
>
> o Potential security issues (unsecured services, improper runAs usage, 
> lack of traceability)
>
> o Potential scalability issues
>
> o MT capability
>
> o Configurability
>
> o De-installability / de-installation procedure
>
> o Development best practices (details TBD, e.g. namespacing, support of 
> separate Repo + Share + SOLR instances, use of public services, no 
> hard override of Alfresco out-of-the-box files or beans …)
>
This seems like a whole separate track to me, and I want to be on it! 
"TECH", or do we have that already?

> Rationale: Our list needs to provide a significant “added value” to 
> complement addons.alfresco.com without being “just another top 10 
> list” that you can find on some blogs already. The majority of us 
> should be able to say “I would have no qualms using that add-on in 
> production” for every add-on we list. Our tests / review should never 
> be a replacement for proper QA by the add-on developer, instead 
> focusing more on the “Model Citizen” aspects, interoperability and 
> compatibility. As a group, we should have enough experience to review 
> add-ons concerning the more advanced concepts (MT, scalability, 
> de-installation) which a single (beginner) developer may not yet be 
> aware of but can hurt unaware add-on users the most…
>
That sounds like the disclaimer that we will need to have at the bottom 
of every "BE" addon page ;-)


> I would like to see us come up with staggered sets of acceptance criteria, e.g.
>
> · Mandatory core criteria that all add-ons must pass unconditionally 
> (e.g. CE compatibility, compatibility with at least X different 
> clients, separate Repo + Share + SOLR instances, no hard overrides…)
>
> · Secondary criteria with limited allowed violations (e.g. improper 
> runAs usage, compatibility with other add-ons…)
>
> · Tertiary criteria that will result in some sort of flagging, warning 
> or simple documentation within our listing
>
Anything that can be automated should be prioritised, IMHO. (Sidebar: 
what counts as "improper" runAs usage, and how can we detect it? Remind 
me never to tell you about the time I extended the auth system so it 
didn't need ALF_TICKET any more ;-) I suspect there was a lot of improper 
runAs usage in that project.)
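
On the detection side, the best I can picture without real static 
analysis is a naive scan that flags every runAs call for a human to look 
at. AuthenticationUtil.runAs / runAsSystem are Alfresco's actual API 
names; everything else below is a made-up heuristic, not proper analysis:

import pathlib
import re

RUNAS_PATTERN = re.compile(r"AuthenticationUtil\.runAs(?:System)?\s*\(")

def flag_runas_usages(source_root):
    """Yield (file, line number, line text) for every runAs call found."""
    for java_file in pathlib.Path(source_root).rglob("*.java"):
        text = java_file.read_text(errors="ignore")
        for lineno, line in enumerate(text.splitlines(), 1):
            if RUNAS_PATTERN.search(line):
                yield java_file, lineno, line.strip()

if __name__ == "__main__":
    for path, lineno, line in flag_runas_usages("some-addon/src"):  # hypothetical path
        print("%s:%d: %s" % (path, lineno, line))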

Static code analysis is a fascinating topic but somewhat beyond me. 
"Compatibility with other add-ons" might be more straightforward in a 
way, as long as we have enough automated tests to work out which add-on 
is breaking things (i.e. automated builds of the system excluding each 
add-on one by one).
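
Something like this, perhaps: rebuild and retest the aggregated system N 
times, leaving out one add-on each time, and see which exclusion makes 
the failures go away. The add-on list and the Maven property are invented 
for the example; the real aggregation project would define how add-ons 
actually get included:

import subprocess

ADDONS = ["addon-a", "addon-b", "addon-c"]  # hypothetical list of listed add-ons

def suite_passes_without(excluded):
    """Build and test with every add-on except `excluded` (assumed Maven setup)."""
    included = [a for a in ADDONS if a != excluded]
    returncode = subprocess.call(
        ["mvn", "clean", "verify", "-Daddons=" + ",".join(included)]  # assumed property
    )
    return returncode == 0

if __name__ == "__main__":
    for addon in ADDONS:
        if suite_passes_without(addon):
            print("Suite passes without %s, so it is the likely culprit" % addon)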


> *_Activity #2: (long term) Continuous update / maintenance of add-on 
> list_*
>
> · Trigger #1: Updated Alfresco CE release
>
I strongly feel we should only track the stable release, i.e. 4.2.f at 
the moment, maybe 5.0.e/f/g at some point. But maybe that is a cop-out 
for our "customers" (who don't pay us :P) who would like the newer 
releases? We would have a lot less effort to expend if we just tracked 
the senior stable version, though. Who knows, we may never move to 5.0 
and just extend, enhance and fix 4.2? Not saying that's a goal, but it 
could happen. Also, we could call it "BE 42" and Boriss would be ecstatic 
(he's a fan of The Hitchhiker's Guide to the Galaxy) ;-D


> · Creation / Setup of tooling / test automation for basic compatibility 
> / functionality tests
>
> o Prepared reference environment (e.g. Amazon Machine Image incl. 
> reference data set + CloudFormation template)
>
I would love to standardise on Vagrant + Puppet for this. That would make 
it super easy to open-source the devops config.

> o Source / build project to aggregate all / subset of listed add-ons 
> for testing
>
> o Individual UI test script / test suite for each add-on (Selenium or 
> anything else (not my strong area))
>
> o Potential test script / test suite for backend services of each add-on
>
My earlier cursory reading of this mail probably triggered the thoughts 
about tests that I've already mentioned above. WE NEED THEM! :D Also, I 
have done a bit with Selenium, so I am up for writing scripts and/or 
doing knowledge transfer.

> · Flagging of listed add-ons that no longer match acceptance criteria
>
One of those criteria should be that the developer responds to an 
occasional re-confirmation email.
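
The bookkeeping for that could stay very simple. A sketch, with an 
invented data structure and an assumed two-year window:

from datetime import date, timedelta

RECONFIRM_WINDOW = timedelta(days=730)  # roughly the "couple of years" mentioned above

# hypothetical listing data: add-on name -> date of last maintainer response
LISTING = {
    "example-addon": date(2014, 8, 1),
    "stale-addon": date(2012, 3, 15),
}

def flagged_addons(today=None):
    """Return the add-ons whose last confirmation is older than the window."""
    today = today or date.today()
    return [name for name, last in LISTING.items() if today - last > RECONFIRM_WINDOW]

print(flagged_addons(today=date(2014, 8, 20)))  # -> ['stale-addon']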

> · Dropping / removal of listed add-ons that repeatedly failed 
> acceptance criteria / are no longer actively maintained
>
Everyone who contributes to the Honeycomb should know that bees always 
remove decaying matter from their hives. When the decaying matter is too 
big to remove (e.g. a mouse gets in and dies), they surround it with 
propolis, which is like bee duct tape: http://en.wikipedia.org/wiki/Propolis


Buzz buzz
Martin

