Planeta GNOME Hispano
The Hispanic GNOME activity, 24 x 7

22 October 2019

Testing Indico opensource event management software

Indico event management tool

After organizing a bunch of conferences over the past years, I found that some communities had problems choosing conference management software: every alternative had limitations in one way or another. Along the way I collected a list of opensource alternatives, and recently I've become very interested in Indico. This project is created and maintained by CERN (yes, the people who invented the WWW too).

The most interesting reasons for me are:

Jornadas WMES 2019

With the help of Franc Rodríguez we set up an Indico testing instance at https://indico.olea.org. This system is ready to be broken so feel free to experiment.

So this post is an invitation to any opensource community wanting to test the feasibility of Indico for their future events. Please consider giving it an opportunity.

Here are some items I consider relevant for you:

And some potential enhancements (not fully checked whether they are currently available or not):

  • videoconf alternatives: https://meet.jit.si
  • social networks integration
    • Twitter
    • Mastodon
    • Matrix
  • exports formats
    • pentabarf
    • xcal, etc
  • full GDPR compliance (it seems you just need to add the relevant information to your instance)
  • gravatar support
  • integration with the SSO used by each community (to be honest, I didn't check the Flask-Multipass features)
  • maybe an easier invitation procedure: sending invitation links by email for full setup;
  • map integration (OSM and others).

For your tests you'll need to register at the site and contact me (see the bottom of this page) so I can add you as a manager for your community.

I think it would be awesome for many communities to share a common software product. Wouldn't it?

PS: Great news, next March CERN will host an Indico meeting!

17 October 2019

Gnome-shell Hackfest 2019 – Day 3

As promised, here are some late notes on the 3rd and last day of the gnome-shell hackfest, which was yesterday!

Some highlights from my partial view:

  • We had a mind-blowing, in-depth discussion about the per-crtc frame clocks idea that's been floating around for a while. What started as a “light” before-bedtime conversation the previous night continued the day after, straining our neurons in front of a whiteboard. We came out wiser nonetheless, and have a much more concrete idea about how it should work.
  • Georges updated his merge request to replace Cogl structs with graphene ones. This now passes CI and was merged \o/
  • A lot of patch review happened on site, and some other pretty notable refactors and cleanups were merged.
  • The evening was more rushed than usual, with some people already leaving. The general feeling seemed good!
  • In my personal opinion the outcome was pretty good too. There's been progress at multiple levels and new ideas were sparked; you should look forward to posts from others :). It was also great to put a face to some IRC nicks, and to meet again all the familiar ones.

Kudos to the RevSpace members and especially Hans; without them this hackfest couldn't have happened.

16 October 2019

Gnome-shell Hackfest 2019 – Day 2

Well, we are starting the 3rd and last day of this hackfest… I’ll write about yesterday, which probably means tomorrow I’ll blog about today :).

Some highlights of what I was able to participate in or witness:

  • Roman Gilg of KDE fame came to the hackfest; it was a nice opportunity to discuss mixed DPI densities for X11/Xwayland clients. We first thought about having one server per pixel density, but later on we realized we might not be that far from actually isolating all X11 clients from each other, so why stop there?
  • The conversation drifted into other topics relevant to desktop interoperation. We discussed window activation and focus stealing prevention, a topic “fixed” in GNOME but through a private protocol. I already had a protocol draft around, which was sent today to the wayland-devel ML.
  • A plan was devised for what is left of Xwayland-on-demand, and an implementation is in progress.
  • The designers have been doing some exploration and research on how we interact with windows, the overview and the applications menu, and thinking about alternatives. At the end of the day they’ve demoed to us the direction they think we should take.

    I am very much not a designer and I don’t want to spoil their fine work here, so stay tuned for updates from them :).

  • As the social event, we had a very nice BBQ with some hackerspace members, again kindly organized by RevSpace.

14 October 2019

Gnome-shell Hackfest 2019 – Day 1

So today the gnome-shell hackfest kicked off in Leidschendam, the Netherlands.

There's a decent number of attendees from multiple parties (Red Hat, Canonical, Endless, Purism, …). We all brought various items and future plans for discussion, and have a number of merge requests in various states to go through. Some exciting keywords are Graphene, YUV, mixed DPI, Xwayland-on-demand, …

But that is not all! Our finest designers also got together here, and I overheard they are discussing usability of the lock screen, among other topics.

This event wouldn't have been possible without the RevSpace hackerspace people and especially our host Hans de Goede. They kindly provided the venue and necessary material; I am deeply thankful for that.

As there are various discussions going on simultaneously, it's kind of hard to keep track of everything, but I'll do my best to report back on this blog. Stay tuned!

13 October 2019

Jornadas Wikimedia España WMES 2019: a wikithon on Andalusia's immovable historical heritage

Jornadas WMES 2019

In my last post I already mentioned that I will lead a workshop on editing with Wikidata at the Jornadas Wikimedia España 2019. Here are the links and references we will use in the workshop. We focus on the case of Andalusia's immovable historical heritage because I have been working with it for a while and am familiar with it, but it can be extrapolated to any other similar domain.

I want to encourage anyone interested to take part, regardless of your experience with Wikidata. I think it will be worth it. What I do ask, please, is that everybody bring a laptop.

Official references

Main Wikimedia services of interest to us

Related material in the Wikimedia projects:

Related SPARQL queries to Wikidata:

Other external services of interest:

Monument examples

We will use a few examples as reference material. The Alhambra one is especially relevant because it is the entry with the most data in the whole catalogue of the Andalusian heritage guide, by a wide margin.

Alhambra de Granada

Puente del Hacho

Estación de Renfe de Almería

Acknowledgements

Jornadas WMES 2019

My attendance at the conference has been possible thanks to the financial support of the Wikimedia España association. My thanks to them from here.

10 October 2019

Next conferences

Just to say I'm going to a couple of conferences here in Spain:

At WMES 2019 I will lead a Wikidata workshop about adding historical heritage data, basically repeating the one from esLibre.

At LAS 2019 I plan to attend the Flatpak workshops and to call a BoF for people involved in organizing opensource conferences, to share experiences and reuse tools.

Many thanks to the Wikimedia España association and the GNOME Foundation for their travel sponsorship. Without their help I could not attend both.

See you in Pamplona and Barcelona.

1 October 2019

A new time and life next steps

the opensource symbol

Since the beginning of my career in 1998 I've been involved with Linux and opensource in one way or another. I grew from sysadmin to distro making, hardware certification and finally consulting, plus some other added skills. In parallel I developed a personal career in libre software communities and had the privilege of giving lots of talks, particularly in Spain and Ibero-America. That was a great time. All this stopped in 2011 with the combination of the big economic crisis in Spain and a personal psychological situation. All of it led me to move back from Madrid to my home city, Almería, to recover my health. Now, after several years here, I'm ready to take a new step and reboot my career.

Not all this time has been wasted. I dedicated lots of hours to a new project which in several senses has been the inverse of the typical practices in opensource communities. Indeed, I've tried to apply most of those practices, but with a 100% hyper-local focus instead of the world-wide Internet. This meant working in the context of a medium-small city (less than 200k inhabitants), with intensive in-person meetings supported by Internet communications. Not all the results have been as successful as I expected, probably because I kept very big expectations; as Antonio Gramsci said, «I'm a pessimist because of intelligence, but an optimist because of will» :-) The effort developed into what we named HackLab Almería, and some time ago I wrote a recap about my experience. To me it was both an experiment and a recovery therapy.

That time served to recover ambitions, à la Gramsci, and to bring relevant itinerant events, always related to opensource, to our nice city. Retaking the experience of the good old HispaLinux conferences, we were able to host a set of extraordinarily great technological conferences: from PyConES 2016 to Akademy 2017, GUADEC 2018 and LibreOffice Conference 2019. For some time I thought Almería was the first city to host these three… until I realized Brno did it before! The icing on the cake was the first conference on secure programming in Spain: SuperSEC. I consider all of this a great personal success.

I forgot to mention I enrolled in a university course too, more as an excuse to work in an area for which I had never found time: information and software methodology modeling. This materializes in my degree project, in an advanced state of development but not yet finished, around the ISO/IEC 29110 standard and the EPF Composer. I'm giving it a final push in the coming months.

Now I'm closing this stage to start a new one, with different priorities and goals. The first one is to reboot my professional career, so I'm looking for a new job and have started a B2 English certification course. I'm resuming my participation in opensource communities —I'll attend LAS 2019 next November— and hope to contribute small but non-trivial collaborations to several communities. After all, I think most of what I've been doing all these years has been just shepherding the digital commons.

See you in your recruitment process! ;-)

PS: this is a Spanish version of this post.

30 September 2019

A new stage: a change of era and my professional future

Changing tack: more of the same, but done differently

the opensource symbol

Since I started my professional career in 1998 I have almost always been involved with the Linux world and free software. Even if it hasn't been too brilliant, I don't regret, as the saying goes, what I have done so much as what I haven't done and the opportunities I could have exploited. But everything changed in 2011, when the Spanish economic crisis combined with work problems and, above all, personal ones, which forced me to move back from Madrid to my home city, Almería, to recover. It has taken time, but it seems we have made it. It was during this period that what we ended up calling HackLab Almería was born.

Personally, the activity in the HLA was an experiment in applying the background of knowledge and practices acquired in open opensource communities over more than 10 years but, in this case, with a totally inverted approach: from communities that are mainly online, with even worldwide reach, to a radically hyper-local orientation with a necessarily intense in-person scope. At that time I had a lot of free time and I threw myself into creating content, identifying and establishing personal contacts, and energizing a new community that could reach self-sustaining inertia and critical mass. It was also in that period, during an occasional visit to Madrid —by then my travelling had dropped to almost zero— after a motivating conversation with ALMO, that I began to recover lost enthusiasm and the urge to create, which finally crystallized into months of intense activity that, free of professional or economic ties, also served to recover skills and cultivate new ones, going deep into a project aligned with my experience and interesting enough to hold my attention permanently. Useful time as therapy to recover self-esteem, peace of mind and intellectual performance.

Along the way I took the chance to reinforce my practice of the hacker ethic: for years I have been a great dilettante with, perhaps, many things to say but very little impact. And that is not a nice feeling for a narcissist. So I decided to make an effort to talk less and do more. The degree to which I succeeded could be discussed separately some other time, although back then I did write a retrospective. I also devoted attention to going deeper into open knowledge and the digital commons: MusicBrainz, Wiki Commons, Wikidata, OpenStreetMap, etc.

Around then, almost by chance, the opportunity arose to bring important technological gatherings —one way or another always related to opensource— to this city of mine, peripheral within the periphery and, perhaps, the only island of Spain located on the Iberian Peninsula itself. While I already had previous experience promoting and collaborating in those HispaLinux conferences, the work on PyConES 2016 —thanks Juanlu for the trust— was a qualitative leap that later materialized in hosting Akademy 2017, GUADEC 2018 and LibreOffice Conference 2019 in Almería. For some time I thought ours would be the first city to achieve this triple… until I discovered that Brno had beaten us to it :-) Along the way we also invented SuperSEC, the first national conference on secure programming in Spain.

Now I consider this stage finished, and in part I am quite frustrated. I am not satisfied with all the results, in particular with the local impact. While preparing this article I had thought about going into some descriptive details but… what for? Those who could have been interested were not at the time, and it would still hurt me to go into a retrospective and… in the end, what for? To be another lapse vanished into entropy. My conscience is clear, though, because I know that, for better or worse, I gave it my all.

So: time to change tack. October 1st is not a bad date for it. I'm throwing myself back into developing my professional profile and, attention dear audience, I am looking for a job. Obviously, the closer to and the more related to the free software world and its surroundings, the better. There is still a huge amount to do to build the free digital infrastructure an open digital society needs, and I want to keep being part of that. After all, I believe everything I have done since the 90s has been shepherding the digital commons.

See you in your recruitment process ;-)

PS: this is the English version of this article.

23 September 2019

LibreOffice Conference 2019 by numbers

LibreOffice Conference 2019 badge

LibreOffice Conference 2019 has ended and… it seems people really enjoyed it!

Here are some metrics about the conference. Hopefully they'll be useful for the coming years.

  • Attendees:
    • 114 registered on the website before the Aug 31 deadline;
    • 122 total registered by the end of the conference;
    • 102 physically registered at the conference.
  • Registered countries of origin: Albania, Austria, Belgium, Bolivia, Brazil, Canada, Czech Republic, Finland, France, Germany, Hungary, India, Ireland, Italy, Japan, Korea (Republic of), Luxembourg, Poland, Portugal, Romania, Russian Federation, Slovenia, South Africa, Spain, Sweden, Switzerland, Taiwan, Turkey and the United Kingdom;
  • 4 days: 1 for board and community meetings and 3 for conference talks;
  • 3 tracks;
  • 68 talks, 6 GSoC presentations and 13 lightning talks;
  • 1 new individual certification;
  • 4 social events:
    • welcome party, 70 participants;
    • beach dinner party, 80 participants;
    • theatrical visit to the Alcazaba castle, 50 participants;
    • after conference city visit, 14 participants;
  • 1 hackfest, approximately 50 participants;
  • 1 conference shuttle bus service (capacity for more than 120 persons);
  • Telegram communications:
  • Conference pictures, at least:
  • Weather: two completely unexpected rainy days in Almería o_0
  • Regarding economics, the conference ended with some surplus, which is nice. Thanks a lot to our sponsors for making this possible.

Below are some data tables with further information.

Meals at university cafeteria:

                  Sept. 10   Sept. 11   Sept. 12   Sept. 13   Total
meals: expected      70        106        106        107       389
meals: served        54         92         97         86       329


T-shirts, ordered from our friends at FreeWear:

type             size (EU)   number
unisex           S                9
unisex           M               24
unisex           L               36
unisex           XL              15
unisex           XXL             15
unisex           XXXL             7
unisex - tight   S                1
unisex - tight   M                4
unisex - tight   L                2
                 total          113


The LibOCon overnight stays at the Civitas hotel were:

day          number
2019/09/05        1
2019/09/06        1
2019/09/07        5
2019/09/08       32
2019/09/09       57
2019/09/10       75
2019/09/11       77
2019/09/12       77
2019/09/13       64
2019/09/14       13
2019/09/15        3
2019/09/16        3
total overnights: 408


Twitter campaign activity at @LibOCon:

Month    tweets   impressions   profile visits   mentions   new followers
Apr           2          2321              228          9              10
May           6          8945              301          6              19
Jun           3          3063               97          3               5
Jul           3          5355              188          3              13
Aug          10          8388              208         10               2
Sept         75         51200             1246        158   (not available)
totals:      99         79272             2268        189              49



PS: I'm amazed I've blogged almost nothing about the conference until now!!
PS2: Added the overnight stay numbers at the conference hotel.

30 July 2019

HackIt 2019, level 3³

I think this challenge took us more than 50% of our time in this year's HackIt :-O, but it's the kind of challenge we love: you know what has to be done, but the path is tortuous, painful and complex. Let's go for it 🙂

The title of a challenge always carries a hint in the form of a pun. That cube as a superscript…

We analyze the dump and see it's a pcap. We open it with Wireshark and poke around for a while.

A self-respecting HackIt can't go without a Wireshark challenge 🙂

That tcp/25565 port looks familiar…

You could also deduce it was a capture of the Minecraft protocol by looking at the strings. Something like «generic.movementSpeed?» shows up; googling it leads straight to Minecraft, no doubt about it.

Yep, Minecraft. On the server 51.15.21.7. Here we were trolled again by @imobilis… or maybe it was an easter egg in the challenge 🙂 The thing is, that server exists (!) and has a world where you spawn on top of a tower you cannot leave. It even has messages on some signs (of course we tried them all, without success), like the one in the picture (Mundo Survival Kots).

We certainly spent quite some time «playing» on that tower. The messages are false leads.

The dump contains messages sent from the client (10.11.12.52) to the server (51.15.21.7) and vice versa. The payload of the messages is (or so it seemed!) in the clear, and can be extracted with tshark.

$ tshark -r dump -T fields -e data

1b0010408d2e07aeae7d91401400000000000040855ae9b632828401
12004a0000000059c86aa10000000000001ac9
0a0021000000028daf8dbd
0a000e000000028daf8dbd

Here we thought we had it made, because we saw there were Minecraft protocol dissectors for Wireshark, like this one or this one. All very rosy… until we noticed the date of the last commit: 2010. Great… useless for us. So we rolled up our sleeves, went for coffee, and started studying the Minecraft protocol specification, which reads as if written by someone taking notes at a talk rather than as a well-written specification. There are exactly 0 examples of the most cumbersome parts (VarInt, packets with compression, …). Anyway, our teammate Joserra, an Excel wizard, decided that our scripts were **** crap and that he was going to do it in Excel ¯\_(ツ)_/¯
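Since the spec offers no worked examples of the trickiest parts, here is a minimal VarInt decoder sketch in Python (a hypothetical helper, not part of our actual solution), following the encoding described in the protocol documentation:

```python
def read_varint(data: bytes, pos: int = 0):
    """Decode a Minecraft-protocol VarInt: little-endian groups of 7 bits,
    where the high bit of each byte flags whether another byte follows."""
    result = 0
    for shift in range(0, 35, 7):  # a VarInt is at most 5 bytes long
        byte = data[pos]
        pos += 1
        result |= (byte & 0x7F) << shift
        if not byte & 0x80:        # continuation bit clear: we are done
            return result, pos
    raise ValueError("VarInt too long")

# The first byte of the first payload, 0x1b, decodes to the packet length (27):
print(read_varint(bytes.fromhex("1b0010")))  # → (27, 1)
```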

If we take the first payload, 0x1b is the packet size (27 bytes), 0x10 the packetID and 408d2e07aeae7d91401400000000000040855ae9b632828401 the packet payload. 0x10 is the ID of a «Player Position» packet («Bound to server» indicates it's the client sending it to the server). The payload splits into 4 fields: x (double), feet y (double), z (double) and «on ground» (boolean). All the position packets (0x10, serverbound) are odd, so they end in 1 (true, on ground). We want to know x, y and z.

x= 408d 2e07 aeae 7d91
y = 4014 0000 0000 0000
z = 4085 5ae9 b632 8284

To convert from hex to double, we invoke a macro, hex2dbl

It's not the first time we've solved a challenge with Excel 🙂

and we get the x, y, z positions.
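For the record, the same hex-to-double conversion can be sketched in Python with struct (a sketch, not what we used on the day; the 3-byte header offset follows the packet layout described above):

```python
import struct

# First serverbound "Player Position" (0x10) packet from the tshark output.
raw = bytes.fromhex(
    "1b0010408d2e07aeae7d91401400000000000040855ae9b632828401"
)

# Skip the 3 header bytes (packet size and 0x10 packet id, as described above),
# then read the three big-endian doubles and the trailing boolean.
x, feet_y, z = struct.unpack(">ddd", raw[3:27])
on_ground = bool(raw[27])

print(x, feet_y, z, on_ground)  # feet_y comes out as exactly 5.0
```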

Finally, we generate a scatter plot and obtain the key 🙂

@imobilis must have spent hours moving the Minecraft player around the map to trace the text. If you look closely, it always starts from one point, goes down, and climbs back to that point to trace the next letter. Analyzing the payload, the height of that upper zone is different from the height where the letters are drawn. Probably, in the game there was a kind of step marking the «safe» zone (where you could move to the right to paint the next letter). What a job!

Watch out for uppercase and lowercase, 0 vs. O, 1 vs. I, etc… That was the final trolling of a good challenge 🙂

BLoCkD3f1nEdPrOt0coL

UPDATE: @navarparty (the first team to beat this challenge) has published their solution (in Go!). Thanks @tatai!
I also recommend reading w0pr's write-up and their elegant solution in Python + pygame.

29 July 2019

HackIt! 2019, Level 2

This level seems to have tripped up quite a few teams. Although we spent a good few hours going around in circles, once you solve it you realize that what made it complex was really a set of red herrings, or false leads. If you followed them, you were dead. The level starts with 3 files: yellow, red and green. Here is the first hook: why these colors?… Anyway, extracting strings, the file that stands out most is red.

Juanan-2:2 juanan$ strings -n 12 red|more
Ktablered1000red1000^LCREATE TABLE red1000(redz blob)N
ytablered100red100
CREATE TABLE red100(reda varchar(10),redb varchar(10))H
utablered9red9
CREATE TABLE red9(reda varchar(10),redb varchar(10))H
utablered8red8
CREATE TABLE red8(reda varchar(10),redb varchar(10))H
utablered7red7
CREATE TABLE red7(reda varchar(10),redb varchar(10))H
utablered6red6
CREATE TABLE red6(reda varchar(10),redb varchar(10))H
utablered5red5
CREATE TABLE red5(reda varchar(10),redb varchar(10))H
utablered4red4
...
CREATE TABLE red1(reda varchar(10),redb varchar(10))
0000000 5473 6572 6d34 3352 000a

Well… a database, probably SQLite. And the redz column of table red1000 is of type blob. We went around and around on this. We even managed to import the table structure.

In table red1, the reda column contains something:

But that already showed up in the strings; there was no need to get tangled up with SQLite… Hmm, let's see what it means:

misterio = [0x54,0x73,0x65,0x72,0x6d,0x34,0x33,0x52,0x00,0x0a]
import binascii
print("".join( chr(c) for c in misterio))
Tserm43R

Tserm43R? WTF? @ochoto commented in the group that maybe each pair of values had to be swapped (big endian?), because the last bytes are a newline + end-of-string, inverted (0x00, 0x0a). Let's go (dropping the newline):

misterio = [0x54,0x73,0x65,0x72,0x6d,0x34,0x33,0x52]
"".join(chr(a) + chr(b) for a, b in zip(misterio[1::2], misterio[::2]))

'sTre4mR3'

It makes sense; it looks like a chunk of string in h4x0r speak. Let's leave it there and go for green. This one was easier:

$ binwalk -e green

DECIMAL       HEXADECIMAL     DESCRIPTION
--------------------------------------------------------------------------------
27337196      0x1A121EC       HPACK archive data
33554432      0x2000000       gzip compressed data, has original file name: "trololo", from Unix, last modified: 2019-07-15 23:29:50

$ ls -al _green.extracted/
total 8
drwxr-xr-x   3 juanan  wheel    96 Jul 25 21:28 .
drwxrwxrwt@ 70 root    wheel  2240 Jul 29 22:15 ..
-rw-r--r--   1 juanan  wheel     8 Jul 25 21:28 trololo

$ cat _green.extracted/trololo
ce1VEd!

Well, if we concatenate red with green (same order as in the level statement), we get 'sTre4mR3ce1VEd!'. Looking very good. We only have one file left, yellow. It's a binary file, with no magic number or any useful strings. After much going around, we came up with something obvious (right, @navarparty? XDDD): opening it with Audacity.

Bingo: you can hear somebody spelling out, in English and at high speed, the part of the password we are missing. Adjusting the playback speed, and noting that the darker zones of the signal indicate uppercase letters, we get R0tT3nB1t.

So… R0tT3nB1tsTre4mR3ce1VEd!

Note: this post doesn't reflect the difficulty of the challenge. It wasn't «that easy» 🙂 We spent a looooooong time analyzing the 3 binaries before finding the right sequence of steps and tools.

HackIt! 2019, level 1

One more year —the 20th already— we attended the Euskal Encounter ready to give it our all, especially in the HackIt! and the CTF. In this EE27 HackIt! we sweated buckets to pass 3 of the 6 challenges, achieving second place, which gives an idea of the difficulty level. That said, all of them were well thought out and carefully crafted, so first of all, as always, thanks to @imobilis and @marcan42 for their work. Honestly, just imagining what it must have cost to implement some of them (level 3, the Minecraft one, in particular must have been painful… or level 6, with the LED strip and the LFSR in the image) makes me want to buy them a drink so they come back next year with new ideas 🙂 Anyway, let's get down to business: level 1, Extreme Simplicity.

We open the source code and see the following chunk of JS:

function q(e){var a=">,------------------------------------------------------------------------------------[<+>[-]],----------------------------------------------------[<+>[-]],------------------------------------------------------------------------------------------------------------------[<+>[-]],----------------------------------------------------------------------------------------------------------------[<+>[-]],-------------------------------------------------[<+>[-]],--------------------------------------------------------------------------------------------------------------------[<+>[-]],-----------------------------------------------------------------------------------[<+>[-]],-------------------------------------------------------------------[<+>[-]],------------------------------------------------------------------------------------------------------------------[<+>[-]],-------------------------------------------------[<+>[-]],----------------------------------------------------------------------------------------------------------------[<+>[-]],------------------------------------------------------------------------------------[<+>[-]],[<+>[-]][-]+<[>>>++[>+++[>+++++++++++++++++++<-]<-]>>.-------------.-.<<<<[-]<[-]]>[>>>++[>+++[>+++++++++++++++++<-]<-]>>+.[>+>+<<-]>+++++++++++.>--..<----.<<<[-]]";let r=0,f=0;var i=a.length,c=new Uint8Array(3e4),s="",b=10240,k=0;for(r=0;r<i&&!(b<0);r++)switch(b--,a[r]){case">":f++;break;case"<":f>0&&f--;break;case"+":c[f]=c[f]+1&255;break;case"-":c[f]=c[f]-1&255;break;case".":s+=String.fromCharCode(c[f]);break;case",":k>=e.length?c[f]=0:c[f]=e.charCodeAt(k),k++;break;case"[":if(!c[f])for(var t=0;a[++r];)if("["===a[r])t++;else if("]"===a[r]){if(!(t>0))break;t--}break;case"]":if(c[f])for(t=0;a[--r];)if("]"===a[r])t++;else if("["===a[r]){if(!(t>0))break;t--}}return s}
$(function(){$('#password').keyup(function(e){$('#password').css({'background-color':q($('#password').val())});});});

Here we started wasting time (business as usual 🙂) debugging with the DevTools. We created a new snippet, pasted the code, clicked {} for a pretty-print, inserted a final line: console.log(q('password')), set a breakpoint on line 2 of q() and stepped through the function… OK, it could be solved this way, but it would take us hours… Someone in the group, with very good judgement, not only saw that the code was Brainfuck, but also thought that translating it to C would be a good first step. We cloned this translator, ran it on the Brainfuck and got this simple program.

If we look closely, we see the codes of several ASCII characters (84, 52, 114…), so, first of all, we tried that sequence and… Bingo!

import re

# Pull the subtracted constants out of the generated C code; values above 13
# are the ASCII codes of the password characters.
file = open("bf.c", "r")
for line in file:
    match = re.search(r'tape.*-= ([0-9]*)', line)
    if match:
        if int(match.group(1)) > 13:
            print(chr(int(match.group(1))), end='')

21 July 2019

What am I doing with Tracker?

“Colored net” by Chris Vees (priorité maison) is licensed under CC BY-NC-ND 2.0

Some years ago I was asked to come up with some support for sandboxed apps wrt indexed data. This drummed up into Tracker 2.0 and domain ontologies, allowing those sandboxed apps to keep their own private data, with a collection of Tracker services to populate it.

Fast forward to today and… this is still largely unused; Tracker-using flatpak applications still whitelist org.freedesktop.Tracker, and are thus allowed to read and change content there. Although I've been told it's mostly a lack of time, I cannot blame them: domain ontologies offer perfect isolation at the cost of perfect duplication. It may do the job, but it is far from optimal.

So I got asked again: “do we have a credible story for sandboxed Tracker?”. One way or another, it seems we don't. Back to the drawing board.

Somehow, the web world seems to share some problems with our case, and seems to handle them with some degree of success. Let's have a look at some excerpts of the Sparql 1.1 recommendation (emphasis mine):

RDF is often used to represent, among other things, personal information, social networks, metadata about digital artifacts, as well as to provide a means of integration over disparate sources of information.

A Graph Store is a mutable container of RDF graphs managed by a single service. […] named graphs can be added to or deleted from a Graph Store. […] a Graph Store can keep local copies of RDF graphs defined elsewhere […] independently of the original graph.

The execution of a SERVICE pattern may fail due to several reasons: the remote service may be down, the service IRI may not be dereferenceable, or the endpoint may return an error to the query. […] Queries may explicitly allow failed SERVICE requests with the use of the SILENT keyword. […] (SERVICE pattern) results are returned to the federated query processor and are combined with results from the rest of the query.

So according to Sparql 1.1, we have multiple “Graph Stores” that manage multiple RDF graphs. They may federate queries to other endpoints with disparate RDF formats and whose availability may vary. This remote data is transparent, and may be used directly or processed for local storage.

Let’s look back at Tracker, we have a single Graph Store, which really is not that good at graphs. Responsibility of keeping that data updated is spread across multiple services, and ownership of that data is equally scattered.

Then it hit me: if we transpose those same concepts from the web to the network of local services that your session is, we can use those same mechanisms to cut a number of drawbacks short:

  • Ownership is clear: If a service wants to store data, it would get its own Graph Store instead of modifying “the one”. Unless explicitly supported, Graph Stores cannot be updated from the outside.
  • So is lifetime: There’s been debate about whether data indexed “in Tracker” is permanent data or a cache. Everyone would get to decide their best fit, unaffected by everyone else’s decisions. The data from tracker-miners would totally be a cache BTW :).
  • Increases trustworthiness: If Graph Stores cannot be tampered with externally, you can trust their content to represent the best effort of their only producer, instead of the minimum common denominator of all services updating “the Graph Store”.
  • Gives a mechanism for data isolation: Graph Stores may choose limiting the number of graphs seen on queries federated from other services.
  • Is sandboxing friendly: From inside a sandbox, you may get limited access to the other endpoints you see, or to the graphs offered. Updates are also limited by nature.
  • But it works the same without a sandbox. It also has side benefits, like reducing data duplication and making for smaller databases.
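To make the idea concrete, an update a producer service might run against its own Graph Store could look like the following sketch (the graph IRI and contact IRI are invented for the example; nco: is Tracker’s contacts ontology):

```sparql
# Hypothetical producer inserting into its own, private Graph Store.
# The graph and subject IRIs below are made up for illustration.
INSERT DATA {
  GRAPH <graph://my-service/contacts> {
    <contact:alice> a nco:PersonContact ;
                    nco:fullname "Alice" .
  }
}
```

Other services would only ever see this data through queries federated to the owning service, never by updating that graph themselves.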

Domain ontologies from Tracker 2.0 also handle some of those differently, but very very roughly. So the first thing to do to get to that RDF nirvana was muscling up that Sparql support in Tracker, and so I did! I already had some “how could it be possible to do…” plans in my head to tackle most of those, but unfortunately they require changes to the internal storage format.

As it seemed the time to do one (FTR, the storage format has been “unchanged” since 0.15), I couldn’t just do the bare minimum work; it was too good an opportunity to miss, rather than leaving leftover Sparql 1.1 syntax support for future format changes.

Things ended up escalating into https://gitlab.gnome.org/GNOME/tracker/commits/wip/carlosg/sparql1.1, where it can be said that Tracker supports 100% of the Sparql 1.1 syntax. No buts, maybe bugs.

Some notable additions are:

  • Graphs are fully supported there, along with all graph management syntax.
  • Support for query federation through SERVICE {}
  • Data dumping through DESCRIBE and CONSTRUCT query forms.
  • Data loading through LOAD update form.
  • The pesky negated property path operator.
  • Support for rdf:langString and rdf:List
  • All missing builtin functions

This is working well, and is almost a drop-in replacement (one’s got to mind the graph semantics), so making it material for GNOME 3.34 starts to sound realistic.
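For instance, the new SERVICE support allows federating part of a query to another endpoint, tolerating its absence with SILENT as the recommendation describes. A hypothetical sketch (the endpoint and graph IRIs are invented; nmm: is Tracker’s multimedia ontology):

```sparql
# Combine local data with data from a remote endpoint.
SELECT ?song ?label WHERE {
  GRAPH <graph://tracker/audio> {
    ?song a nmm:MusicPiece .
  }
  # SILENT makes failures of the remote service non-fatal.
  SERVICE SILENT <http://example.org/sparql> {
    ?song rdfs:label ?label .
  }
}
```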

As Sparql 1.1 is a recommendation finished in 2013, and no newer versions seem to be in the works, I think it can be said that Tracker is reaching maturity. Only the HTTP Graph Store Protocol (because why not) remains as the big-ish item before we can reasonably say we implement all 11 documents. Note that Tracker’s bet on RDF and Sparql started at a time when 1.0 was the current document and 1.1 just an early draft.

And sandboxing support? You may already guess the features it’ll draw from. It’s coming along; actually using Tracker as described above will go a bit deeper than the required query language syntax, more on that when I have the relevant pieces in place. I just thought I’d stop a moment to announce this huge milestone :).

12 de July de 2019

«La película», a short story

I’m recovering from my archives a little story I wrote in March 1998. It may not be a great thing, but I think the result is lovely.


The afternoon promised to be fantastic. The film series itself promised it. My friends were doing a great job at that film club, and I finally had the chance to see Dune, one of the most special films in science fiction cinema. A good moment to have fun, a good moment to know the world a little better. I was not even sixteen years old.

I arrived late and could only get a seat at the back. It wasn’t the best, but at least I wasn’t left outside like others. Sitting next to me was Bearn. This Bearn was a very strange guy. Too strange for me back then (now I’m the freak) but he wasn’t bad company: fun and doing his own thing. We barely knew each other and I suppose his impression of me couldn’t have been too different. Little does that matter in the end. The only relevant fact is that he was the only witness of my bitterness, that afternoon which comes back to my memory today.

I don’t know exactly when it happened. I could only say it was during the second year of BUP. I don’t know the month or even the season. I barely remember the boy, and of her, only a mist, a desire, an image of beauty freed from any flaw. Like any memory worthy of the name.

I suppose it didn’t take me too long to notice the strange familiarity of the girl sitting in the seat in front of me. That capacity for recognition has been switched to automatic mode in me ever since. Though I must point out to the reader that there is no special reason for it now; it is simply a habit and it does me no harm. As I was saying, I soon recognized in the features of that girl those of the person who had been driving me out of my mind for weeks. Months. By then I don’t know whether I had already declared my love to her (my first and only declaration of love, a sad and lonely “I love you” at the door of the high school in the only second I could be alone with her); the fact was that she was there and I was not prepared. My restless ego of a juvenile lover got nervous and my sense of observation sharpened to the point of paranoia. She was alone. A girl like that could not be alone. They never are. They move in the shadow of a male when not protected by the invulnerable palisade of their girlfriends. And she was alone. My eyes scanned the whole space around her, suspicious even of the air she breathed. And they came to rest on a guy sitting to her left. I barely knew him; he seemed a good sort and had gone out a couple of times with a classmate of mine. A good guy who didn’t fit into the matter. I was perplexed when I verified that they had indeed come together. The film had started minutes before.

Heartbreak in youth is something very intense. It is empty of all reality, but later, with time, you will long for the passion. When the years have burnt the soul, heartbreak is only bitterness that stokes the fire. In the damned years of youth it is a heroic virtue. As stupid as all of them, it shone equally with heartrending beauty. That afternoon, perhaps a winter one, my heart burst into pieces, the fuse lit by the spark of two hands holding each other. And neither was mine. That afternoon the world fell on top of me, in parallel with the initiation journey of the young Atreides. That afternoon I filled the hidden lakes of Dune with my tears. A stone guest, with the sky brushing my fingertips, I lived my heart’s exile in the desert of a desert planet which, however big it might have been, would never fill the loneliness of my poor self-pitying soul. Tonight the film was a different one. The desert is the same.


FidoNET R34: recovering mail from the echomail areas

This entry was originally published in the esLibre forum https://charla.eslib.re.


FidoNet logo

A few months ago I set out to recover digital material from my archives about the first years of the Linux community in Spain, in particular my FidoNet files. Then a conversation came up on Twitter about the R34.Linux echomail area and the possibility of recovering mail from that era to rescue and republish it:

Kishpa_’s initiative seemed wonderful to me, but when I checked my data I found that at some point I had suffered a crash of the message base and lost all the mail of, what, five years? Or more. The sudden memory of that day hurt almost as much as it did back then.

GoldED editor

Since many of us who were around at the dawn of HispaLinux came from FidoNET, my question is the following: has anyone, by any chance, overcome the vicissitudes of information persistence across the decades and still has their FidoNET archives, so they can be recovered and republished? Not only the mail of the R34.Linux and R34.Unix areas, where it all really began, but any other archived echomail.

If so, let’s get it to Kishpa_. It is a beautiful digital memory recovery project, even if only for archival purposes.

Come on: everyone go dig through your backups!

A copy of the HispaLinux website from 1998

Taking advantage of the review session of my nineties archives, I’m publishing on this website a snapshot of the HispaLinux association website from March 1998:

GoldED editor

You know: I don’t do it out of nostalgia, but for the sake of digital memory.

02 de July de 2019

http://charla.eslib.re is now open

esLibre community

I have the pleasure of announcing that a new web discussion forum for the esLibre community is now up: https://charla.eslib.re. This is another step promoting the regeneration of what the HispaLinux community in Spain once was into a new future:

Charla esLibre

Obviously we have chosen Discourse, the best software for running discussion forums these days. And it is free software, too. My thanks to its whole development team for the wonderful product they have created.

There are also chat groups available on Telegram (https://t.me/esLibre) and on Matrix (#esLibre:matrix.org). Both groups are joined through a Telegram <-> Matrix bridge.

Thanks to the folks who have taken care of setting up all the services. Be welcome.

29 de June de 2019

Now I have a web Solid pod

I’ve just created my Solid pod: https://olea.solid.community/.

Tim Berners-Lee proposes Solid as a way to implement his original vision for the World Wide Web. If timbl says something like this then I’m interested:

Within the Solid ecosystem, you decide where you store your data. Photos you take, comments you write, contacts in your address book, calendar events, how many miles you run each day from your fitness tracker… they’re all stored in your Solid POD. This Solid POD can be in your house or workplace, or with an online Solid POD provider of your choice. Since you own your data, you’re free to move it at any time, without interruption of service.

More details are at https://solid.inrupt.com/how-it-works.

I’ve only poked a bit at what Solid can do; I don’t have much time for it now. It’s nice to see it’s based on linked data, so the potential applications are endless. And they have a forum too (running Discourse, ♥).

My personal IT strategy is to run my own services as much as I can. Solid has a server implementation available which I’d like to deploy somewhere in the future.

Love to see the Semantic Web coming back.

28 de June de 2019

The video of the 29110_EPF_library talk at esLibre 2019 is now published

29110_EPF_library

I’m enormously pleased to announce that the video of the talk I gave at esLibre 2019 has been published. All the credit goes to César García (elsatch), and it is published on his channel La Hora Maker. I’m really thrilled, and I think the result turned out quite well. Thank you, César.

The slides of the talk are also available on this website.

18 de June de 2019

esLibre 2019: Wikithon of the historical built heritage of Andalusia

esLibre conference

In the last entry I already mentioned that I will run a couple of sessions at the esLibre conference next Friday in Granada. This entry is dedicated to the hands-on workshop Wikithon of the historical built heritage of Andalusia: from Andalusia to Spain and Humankind, simply to include a list of links and materials of interest for the workshop. The information is very schematic because it is only meant to be used during that workshop. Here it goes.

Official references

Main Wikimedia services of interest to us

Related material in the Wikimedia projects:

Related SPARQL queries to Wikidata:

Other external services of interest:

Examples of monuments

We will use a few examples as reference material. The Alhambra one is very relevant, because it is the entry with the most data in the whole official Andalusian catalogue, by a wide margin.

Alhambra of Granada

Puente del Hacho

Almería Renfe railway station
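The queries for the workshop will be along these lines, to be tried at https://query.wikidata.org (a sketch: P31/P131/P625 are the usual “instance of”/“located in”/“coordinates” properties, while Q4989906 for “monument” and Q5783 for Andalusia are assumptions you should verify in the service itself):

```sparql
# Monuments located in Andalusia, with coordinates when available.
SELECT ?item ?itemLabel ?coord WHERE {
  ?item wdt:P31/wdt:P279* wd:Q4989906 .  # instance of (a subclass of) monument
  ?item wdt:P131/wdt:P131* wd:Q5783 .    # located in Andalusia
  OPTIONAL { ?item wdt:P625 ?coord . }   # coordinate location
  SERVICE wikibase:label { bd:serviceParam wikibase:language "es,en" . }
}
LIMIT 100
```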

10 de June de 2019

Participation in the esLibre 2019 conference

esLibre conference

Some time ago I mentioned that I had sent several activity proposals to the esLibre conference, where we will meet again, among others, with old friends from the HispaLinux era and, I hope, with lots of new people. In the end I dropped one of the proposals, because the meeting lasts a single day and I also wanted to be able to attend other talks and, above all, hang out with my friends.

These are the two:

29110_EPF_library

As I said before, I’m very excited to present the preliminary work on 29110_EPF_library, the focus of the final degree project (TFG) I will finally defend in September, just before the LibreOffice Conference we will hold in Almería. It will be very useful to help me structure how best to communicate the findings of the project, and with a bit of luck I’ll get some useful feedback for the final report.

About the wikithon: we will not have much time available to upload many results. If you are already familiar with Wikidata it will be perfect, and for everyone else I will try to lower the entry barrier as much as possible. I don’t plan to prepare much more material than a few reference links, including some queries to the fantastic SPARQL query service https://query.wikidata.org. It will be a very interactive session, and with a bit of luck there will be more than one of us to give the newcomers a hand. Remember it is VERY IMPORTANT to bring your own computer. And if you are familiar with linked data, and JSON-LD in particular, don’t miss it, because we may need your help ;-)

The conference programme is already published and will only change in minor adjustments: https://eslib.re/2019/programa/

esLibre conference programme

For those interested: we have a Telegram group, Hispalinustálgicos, to which you are all very much invited.

I hope you decide to come to Granada. Beyond the contents, the best part is the audience it attracts: the cream of the Spanish Linux crowd. And of course the city of Granada itself:

«Give him alms, woman,
for there is nothing in life
like the sorrow of being
blind in Granada.
»

25 de May de 2019

Deleting old messages in Gmail

I’m jotting down a quick recipe here so I don’t forget it, and in case it is of interest to other people.

The problem: I have been using Gmail for maaany years and until today I had never considered cleaning it up. But I was at 93% capacity (17.74 GB used out of 19 GB available), so I rolled up my sleeves and managed to bring it down a bit, to 14.34 GB = 75%, by deleting all messages older than 10 years.

93% usage = yes, I’m an email hoarder

The solution: basically, use this Python script. I added the necessary imports and put it in a Gist. For the script to work, you need to enable IMAP in Gmail and (this is important) access for “less secure” apps in the security options of your Google account.

Enable Less secure app access.

If you have 2-Step Verification, you will have to disable it temporarily. When you finish running the script, remember to turn it back on (and disable Less secure app access).
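The linked script is the one to use; just to illustrate the idea, a minimal sketch with Python’s standard imaplib might look like this (the mailbox name and label handling are assumptions on my part; note that in Gmail “deleting” really means moving the message to Trash):

```python
import datetime
import imaplib

def cutoff_query(years: int, today: datetime.date) -> str:
    """Build the IMAP SEARCH criterion matching messages older than `years` years."""
    cutoff = today.replace(year=today.year - years)
    return "BEFORE " + cutoff.strftime("%d-%b-%Y")

def delete_old_messages(user: str, password: str, years: int = 10) -> None:
    # Requires IMAP and "less secure app" access enabled, as described above.
    conn = imaplib.IMAP4_SSL("imap.gmail.com")
    conn.login(user, password)
    conn.select('"[Gmail]/All Mail"')
    _, data = conn.search(None, cutoff_query(years, datetime.date.today()))
    for num in data[0].split():
        # In Gmail, moving a message to Trash is what actually deletes it.
        conn.store(num, "+X-GM-LABELS", "\\Trash")
    conn.logout()
```

Run it with your address and password, double-check the result in the web UI, and then empty the Trash to reclaim the space.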

75% usage, much better 🙂

05 de May de 2019

Intellectual property registries

Last Monday Carlos J. Vives and I gave a talk about Creative Commons and open content at a local education center:

The talk is part of the ccALM Almería Creative Commons Festival.

The only goal of this entry is to collect some links to IP registration services useful for any digital creator on the Internet, particularly for open culture works. As far as I remember:

In the past there was this Digital Media Rights service, but it seems broken now: http://dmrights.com/

Limited to Spain there is two public managed services:

Some of these services are meant to be used as a legal resource in case of litigation. Others are just a historical record for websites. If you want to use any of them, study carefully their features and the advantages relevant to you.

13 de April de 2019

Catrin Labs Computers: A new home

Wow! The idea of building a new but retro computer has had an excellent reception; with this it started to become a bit more complicated to handle the feedback from my personal Facebook account, and at the same time I started receiving questions from people who don’t speak Spanish. That’s why I decided to give it more infrastructure […]

Source

12 de April de 2019

Alborán BBS: ALBINTRO.ZIP

5 June 1996: Alborán BBS was the second BBS in Almería connected to FidoNet. It had its own promotional demo: ALBINTRO.ZIP


It has rained a little since then. Even in Almería.

11 de April de 2019

An online discussion about free standards

This is not a very important post, but I want to leave a record of a particular discussion regarding free access to standards, particularly the ISO/IEC ones. I’ll not share the link nor mention the sir I confronted:

Original question:

Is there any way to access and get ISO standards for free like sci-hub for scholar papers?

After a previous comment this sir answers:

ISO and ASTM only exist through sales of their standards. Imagine you had to work for no money. Would you be happy with this?

And me, well, I can’t restrain myself…

Very happy, yes!

He answers me:

So you can feed, house, and clothe your family with no income whatsoever?

and then answers to another participant:

Because it’s theft. In certain parts of the Middle East, theft is dealt with by Sharia law. This is my last statement in this question.

And my final arguments:

Dear XXXX: there are a bunch of successful standards bodies publishing documents royalty-free and at no cost. Take just the very well known W3C, OASIS, IETF, IEEE or OMG. I bet they are able to fund themselves.

With ISO the case can be seen as outrageous, since their norms (or the national equivalents) are frequently made mandatory by local laws, and paying to read the law is absolutely unfair.

BTW it is nice to see you mention Sharia law. That says a lot about you.

The Sharia comment shocked me o_0

I can recognize some trolling tone on my part that I probably should moderate. Anyway, I strongly believe in the underlying arguments: legal and industrial standards should be, at least, open access.

An absolutely inspiring analysis of what an open standard really is can be found in Ken Krechmer’s 2005 paper Open Standard Requirements. In particular, I find this table amazing:

Table 1. Creators, implementers and users see openness differently. The ten requirements, each weighed differently by the three stakeholders (creator, implementer, user):

  1. Open Meeting
  2. Consensus
  3. Due Process
  4. One World
  5. Open IPR
  6. Open Change
  7. Open Documents
  8. Open Interface
  9. Open Access
  10. On-going Support


The open standards matter has interested me for many years now (my friends may remember that old SOOS obsession of mine). I should find the time to do some research on standardization and how it applies to software and open development communities.

10 de April de 2019

Participation in the KDE España podcast «Software Libre y la política»

esLibre conference

Rubén Gómez was kind enough to invite me, together with Adrián Chaves and Aleix Pol, to take part in the KDE España podcast session dedicated to free software and politics. Here is the video of the session, which somehow ended up lasting two hours:

We took the opportunity to encourage people to take part in the upcoming events: the esLibre meeting in Granada on June 21, and the international conference of the LibreOffice community, LibOCon 2019, which will take place in our pleasant city of Almería from September 10 to 13.

A great pleasure to share the time with these friends.

Closely related to the topic, I’m recovering the slides of an old talk I gave on several occasions: Administración pública, software libre y Revolución Digital:

Administración pública, software libre y Revolución Digital

And to finish, Rubén took the opportunity to remind me of the short interview they did with me during the visit to the Rodalquilar mines, on the occasion of the Akademy 2017 conference, which we also had the honour of hosting in our Mediterranean city:

09 de April de 2019

Recent news

A few pieces of news today.

esLibre conference

On the one hand, my proposed activities for the esLibre conference have been accepted:

I’m really looking forward to presenting 29110_EPF_library, which is the bulk of the TFG I will defend this summer.

And remember that the call for participation is still open: take advantage of it, and see you in Granada.

esLibre conference

And news about the annual international LibreOffice meeting we will hold in Almería in September:

We are also preparing the May activities of the ccALM Creative Commons Almería 2019 Festival, and we will surely do wiki things and nice mapping sessions again. Come along and bring your own proposals.

And finally, the outrage of the season: I have created a nice git repository to host the history of my Google activity (Google Takeout). Whether it will be useful for anything remains to be seen :-)

08 de April de 2019

Catrin Labs Computers: CLC-88, an 8-bit micro

Getting down to business, it is time to make some decisions for the 8-bit computer. One of the hardest decisions is the processor, since any choice immediately leaves out 50% of the population. Most programmers master only one of the two, the 6502 or the Z-80, depending on the […]

Source

07 de April de 2019

Catrin Labs Computers: Video chips

The not-so-obvious part of the hardware in this project is the video chip. If we think of using a video chip from the era, we would necessarily marry a particular aesthetic, be it the Spectrum’s attribute clash or the Atari’s 4 colours per line, or the […]

Source

Catrin Labs Computers

Some time ago the 8-bit guy presented a wonderful idea: designing an old computer using modern technology. It is precisely a topic I had been mulling over for a long time (probably since I started programming Prince of Persia for the Atari) and when I began to watch the video about […]

Source

01 de April de 2019

Proposals for the esLibre 2019 meeting

esLibre conference

The new esLibre conference is under way, a national meeting on free technologies and knowledge that will take place next June 21 in the city where I was born: Granada. I have taken advantage of the call for participation to answer with three proposals. We will see whether the organization sees fit to approve them. The purpose of this entry is simply to echo them.


Title: HackLab Almería, a model of hyperlocal technology community building

Format: talk

Description:

A retrospective, from a personal point of view, of the experience of a guerrilla community-building model in the provinces, in what we call HackLab Almería, dedicated to promoting technology and knowledge, especially open/free ones. Some metrics and lamentations will be provided.

A preliminary version of the slides is available at http://olea.org/conferencias/doc-conf-20171107-CubaConf/.


Title: 29110_EPF_library: towards an open body of knowledge of software development practices suited to very small organizations

Format: talk or lightning talk

Description:

Very small organizations dedicated to software development have a big problem when they want to formalize the quality control of their practices. As a pragmatic alternative, the ISO 29110 family of standards has been proposed to remove these obstacles.

Embracing those goals, the 29110_EPF_library initiative has set itself these objectives:

  • formal modeled repository of 29110 processes and related information;
  • development and tailoring framework for 29110 processes adoption;
  • low adoption barriers for VSEs:
    • the opensource library licensing frees from royalties or restrictive use of IP;
    • using opensource tools reduces the costs of acquisition;
  • open community development;
  • acts as a body of knowledge (BoK) for 29110 related content in particular and for software and systems engineering in general.

A preliminary version of 29110_EPF_library can be consulted at http://olea.org/tmp/Deploy-Pack-29110-EPF_library/


Title: Wikithon of the historical built heritage of Andalusia: from Andalusia to Spain and Humankind

Format: workshop

Description:

A hands-on workshop where we will all work to expand the coverage, in the different projects of the Wikipedia family, of the elements registered in the official Andalusian catalogue, with tasks such as:

  • creating and detailing Wikidata entries for the different items;
  • locating, contributing, annotating and georeferencing related photographic material;
  • ditto with Wikipedia entries, with preference for the Iberian languages and English.

See you in Granada.

27 de March de 2019

Postfix: Name service error for name=domain.com type=MX: Host not found, try again

I tried to post this on Serverfault, but I couldn’t, since it was blocked by their spam detector.

Here is the full text of my question:


Hi:

I’m stuck with a Postfix MX related problem.

I’ve just migrated a very old CentOS 5 server to v7, so I’m using postfix-2.10.1-7.el7.x86_64. I’ve upgraded the legacy Postfix configuration (maybe the cause of this hell) and other supplementary stuff, which seems to work:

  • postfix-perl-scripts-2.10.1-7.el7.x86_64
  • postgrey-1.34-12.el7.noarch
  • amavisd-new-2.11.1-1.el7.noarch
  • spamassassin-3.4.0-4.el7_5.x86_64
  • perl-Mail-SPF-2.8.0-4.el7.noarch
  • perl-Mail-DKIM-0.39-8.el7.noarch
  • dovecot-2.2.36-3.el7.x86_64

After many tribulations I think I got most of the system running, except for the annoying MX related problems, such as (from /var/log/maillog):

Mar 28 14:26:48 tormento postfix/smtpd[1021]: warning: Unable to look up MX host for spmailtechn.com: Host not found, try again
Mar 28 14:26:51 tormento postfix/smtpd[1052]: warning: Unable to look up MX host for inlumine.ual.es: Host not found, try again
Mar 28 14:31:38 tormento postfix/smtpd[1442]: warning: Unable to look up MX host for aol.com: Host not found, try again
Mar 28 13:07:53 tormento postfix/smtpd[26556]: warning: Unable to look up MX host for hotmail.com: Host not found, try again
Mar 28 13:12:06 tormento postfix/smtpd[26650]: warning: Unable to look up MX host for facebookmail.com: Host not found, try again
Mar 28 13:12:31 tormento postfix/smtpd[26650]: warning: Unable to look up MX host for joker.com: Host not found, try again
Mar 28 13:13:02 tormento postfix/smtpd[26650]: warning: Unable to look up MX host for bounce.linkedin.com: Host not found, try again

and:

Mar 28 14:50:36 tormento postfix/smtp[1700]: 7B6C69C6A2: to=<ismael.olea@gmail.com>, orig_to=<ismael@olea.org>, relay=none, delay=1142, delays=1142/0.07/0/0, dsn=4.4.3, status=deferred (Host or domain name not found. Name service error for name=gmail.com type=MX: Host not found, try again)
Mar 28 14:32:05 tormento postfix/smtp[1383]: 721A19C688: to=<XXXXX@yahoo.com>, orig_to=<XXXX@olea.org>, relay=none, delay=4742, delays=4742/0/0/0, dsn=4.4.3, status=deferred (Host or domain name not found. Name service error for name=yahoo.com type=MX: Host not found, try again)

as examples.

The first suspect is DNS resolution, but this is working fine, both using the Hetzner DNS servers (where the machine is hosted) and 8.8.8.8 or 9.9.9.9:

$ dig mx gmail.com

; <<>> DiG 9.9.4-RedHat-9.9.4-73.el7_6 <<>> mx gmail.com
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 20330
;; flags: qr rd ra; QUERY: 1, ANSWER: 5, AUTHORITY: 0, ADDITIONAL: 1

;; OPT PSEUDOSECTION:
; EDNS: version: 0, flags:; udp: 4096
;; QUESTION SECTION:
;gmail.com.			IN	MX

;; ANSWER SECTION:
gmail.com.		3014	IN	MX	10 alt1.gmail-smtp-in.l.google.com.
gmail.com.		3014	IN	MX	5 gmail-smtp-in.l.google.com.
gmail.com.		3014	IN	MX	40 alt4.gmail-smtp-in.l.google.com.
gmail.com.		3014	IN	MX	20 alt2.gmail-smtp-in.l.google.com.
gmail.com.		3014	IN	MX	30 alt3.gmail-smtp-in.l.google.com.

;; Query time: 1 msec
;; SERVER: 213.133.100.100#53(213.133.100.100)
;; WHEN: jue mar 28 14:56:00 CET 2019
;; MSG SIZE  rcvd: 161

or:


$ dig mx inlumine.ual.es

; <<>> DiG 9.9.4-RedHat-9.9.4-73.el7_6 <<>> mx inlumine.ual.es
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 38239
;; flags: qr rd ra; QUERY: 1, ANSWER: 5, AUTHORITY: 2, ADDITIONAL: 1

;; OPT PSEUDOSECTION:
; EDNS: version: 0, flags:; udp: 4096
;; QUESTION SECTION:
;inlumine.ual.es.		IN	MX

;; ANSWER SECTION:
inlumine.ual.es.	172800	IN	MX	1 ASPMX.L.GOOGLE.COM.
inlumine.ual.es.	172800	IN	MX	10 ASPMX3.GOOGLEMAIL.COM.
inlumine.ual.es.	172800	IN	MX	10 ASPMX2.GOOGLEMAIL.COM.
inlumine.ual.es.	172800	IN	MX	5 ALT1.ASPMX.L.GOOGLE.COM.
inlumine.ual.es.	172800	IN	MX	5 ALT2.ASPMX.L.GOOGLE.COM.

;; AUTHORITY SECTION:
inlumine.ual.es.	172800	IN	NS	dns.ual.es.
inlumine.ual.es.	172800	IN	NS	alboran.ual.es.

;; Query time: 113 msec
;; SERVER: 213.133.100.100#53(213.133.100.100)
;; WHEN: jue mar 28 14:56:51 CET 2019
;; MSG SIZE  rcvd: 217

my main.cf:

$ postconf -n
address_verify_sender = postmaster@olea.org
alias_database = hash:/etc/aliases
alias_maps = hash:/etc/aliases
body_checks = regexp:/etc/postfix/body_checks.regexp
broken_sasl_auth_clients = yes
canonical_maps = hash:/etc/postfix/canonical
command_directory = /usr/sbin
config_directory = /etc/postfix
content_filter = smtp-amavis:[127.0.0.1]:10024
daemon_directory = /usr/libexec/postfix
data_directory = /var/lib/postfix
debug_peer_level = 2
debugger_command = PATH=/bin:/usr/bin:/usr/local/bin:/usr/X11R6/bin ddd $daemon_directory/$process_name $process_id & sleep 5
header_checks = pcre:/etc/postfix/header_checks.pcre
home_mailbox = Maildir/
html_directory = no
inet_interfaces = all
inet_protocols = ipv4
local_recipient_maps = proxy:unix:passwd.byname $alias_maps
mail_owner = postfix
mailbox_command = /usr/bin/procmail -a "$EXTENSION"
mailbox_size_limit = 200000000
mailq_path = /usr/bin/mailq.postfix
manpage_directory = /usr/share/man
message_size_limit = 30000000
mydestination = $myhostname, localhost.$mydomain, localhost, $mydomain, tormento.olea.org, /etc/postfix/localdomains
myhostname = tormento.olea.org
newaliases_path = /usr/bin/newaliases.postfix
policy_time_limit = 3600
queue_directory = /var/spool/postfix
readme_directory = /usr/share/doc/postfix-2.10.1/README_FILES
recipient_delimiter = +
sample_directory = /usr/share/doc/postfix-2.10.1/samples
sendmail_path = /usr/sbin/sendmail.postfix
setgid_group = postdrop
smtp_tls_cert_file = /etc/pki/tls/certs/tormento.olea.org.crt.pem
smtp_tls_key_file = /etc/pki/tls/private/tormento.olea.org.key.pem
smtp_tls_mandatory_protocols = !SSLv2,!SSLv3
smtp_tls_note_starttls_offer = yes
smtp_tls_security_level = may
smtpd_helo_required = yes
smtpd_recipient_restrictions = permit_mynetworks check_client_access hash:/etc/postfix/access permit_sasl_authenticated reject_non_fqdn_recipient reject_non_fqdn_sender reject_rbl_client cbl.abuseat.org reject_rbl_client dnsbl-1.uceprotect.net reject_rbl_client zen.spamhaus.org reject_unauth_destination check_recipient_access hash:/etc/postfix/roleaccount_exceptions reject_multi_recipient_bounce check_helo_access pcre:/etc/postfix/helo_checks.pcre reject_non_fqdn_hostname reject_invalid_hostname check_sender_mx_access cidr:/etc/postfix/bogus_mx.cidr check_sender_access hash:/etc/postfix/rhsbl_sender_exceptions check_policy_service unix:postgrey/socket permit
smtpd_sasl_auth_enable = yes
smtpd_sasl_local_domain = $myhostname, olea.org, cacharreo.club
smtpd_sasl_path = private/auth
smtpd_sasl_security_options = noanonymous
smtpd_sasl_type = dovecot
smtpd_tls_auth_only = no
smtpd_tls_cert_file = /etc/pki/tls/certs/tormento.olea.org.crt.pem
smtpd_tls_key_file = /etc/pki/tls/private/tormento.olea.org.key.pem
smtpd_tls_loglevel = 1
smtpd_tls_mandatory_protocols = TLSv1
smtpd_tls_received_header = yes
smtpd_tls_security_level = may
smtpd_tls_session_cache_timeout = 3600s
tls_random_source = dev:/dev/urandom
transport_maps = hash:/etc/postfix/transport
unknown_local_recipient_reject_code = 550
virtual_maps = hash:/etc/postfix/virtual

and my master.cf:

$ postconf -M
smtp       inet  n       -       n       -       -       smtpd
submission inet  n       -       n       -       -       smtpd -o smtpd_tls_security_level=may -o smtpd_sasl_auth_enable=yes -o cleanup_service_name=cleanup_submission -o content_filter=smtp-amavis:[127.0.0.1]:10023
smtps      inet  n       -       n       -       -       smtpd -o smtpd_tls_wrappermode=yes -o smtpd_sasl_auth_enable=yes
pickup     unix  n       -       n       60      1       pickup
cleanup    unix  n       -       n       -       0       cleanup
qmgr       unix  n       -       n       300     1       qmgr
tlsmgr     unix  -       -       n       1000?   1       tlsmgr
rewrite    unix  -       -       n       -       -       trivial-rewrite
bounce     unix  -       -       n       -       0       bounce
defer      unix  -       -       n       -       0       bounce
trace      unix  -       -       n       -       0       bounce
verify     unix  -       -       n       -       1       verify
flush      unix  n       -       n       1000?   0       flush
proxymap   unix  -       -       n       -       -       proxymap
proxywrite unix  -       -       n       -       1       proxymap
smtp       unix  -       -       n       -       -       smtp
relay      unix  -       -       n       -       -       smtp -o fallback_relay=
showq      unix  n       -       n       -       -       showq
error      unix  -       -       n       -       -       error
retry      unix  -       -       n       -       -       error
discard    unix  -       -       n       -       -       discard
local      unix  -       n       n       -       -       local
virtual    unix  -       n       n       -       -       virtual
lmtp       unix  -       -       n       -       -       lmtp
anvil      unix  -       -       n       -       1       anvil
scache     unix  -       -       n       -       1       scache
smtp-amavis unix -       -       n       -       2       smtp -o smtp_data_done_timeout=1200 -o smtp_send_xforward_command=yes -o disable_dns_lookups=yes -o max_use=20
127.0.0.1:10025 inet n   -       n       -       -       smtpd -o content_filter= -o local_recipient_maps= -o relay_recipient_maps= -o smtpd_restriction_classes= -o smtpd_delay_reject=no -o smtpd_client_restrictions=permit_mynetworks,reject -o smtpd_helo_restrictions= -o smtpd_sender_restrictions= -o smtpd_recipient_restrictions=permit_mynetworks,reject -o mynetworks_style=host -o mynetworks=127.0.0.0/8 -o strict_rfc821_envelopes=yes -o smtpd_error_sleep_time=0 -o smtpd_soft_error_limit=1001 -o smtpd_hard_error_limit=1000 -o smtpd_client_connection_count_limit=0 -o smtpd_client_connection_rate_limit=0 -o receive_override_options=no_header_body_checks,no_unknown_recipient_checks
policy     unix  -       n       n       -       2       spawn user=nobody argv=/usr/bin/perl /usr/share/postfix/policyd-spf-perl

I fear I’m missing something really obvious, but I’ve been googling for two days, running all kinds of tests, and now I don’t know what else to do.

Thanks in advance.


Postscript:

Well, this is embarrassing. As I predicted, my problem was caused by the most obvious and trivial reason: lack of read access to /etc/resolv.conf for the postfix user o_0

As you probably know, the Postfix subprocesses (smtp, smtpd, qmgr, etc.) run as the postfix user. All the comments and suggestions I received were related to problems accessing DNS resolution data, and the usual suspects were SELinux or a chrooted Postfix. You were all right about the root cause. Following a piece of advice, I tried:

# sudo -u postfix -H cat /etc/resolv.conf
cat: /etc/resolv.conf: Permission denied

So… What??

# ls -l /etc/resolv.conf
-rw-r-----. 1 root named 118 mar 28 20:34 /etc/resolv.conf

OMG!… Then, after a chmod o+r and restarting Postfix, all the email on hold could be processed and sent, and new mail is processed as expected.

I doubt I changed the resolv.conf read permissions myself, but I can’t be 100% sure. So the problem is finally fixed, and I’m very sorry for stealing your attention for such a ridiculous reason.
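For the record, the whole diagnosis boils down to a single mode bit; here is a minimal Python sketch (a hypothetical helper, not part of Postfix) to test whether a file is readable by unrelated users such as postfix:

```python
import os
import stat

def world_readable(path):
    """Return True if the 'other' read bit (o+r) is set on path."""
    return bool(os.stat(path).st_mode & stat.S_IROTH)

# With mode 0640 (as /etc/resolv.conf had here), this returns False,
# so any user outside the owner/group — like postfix — gets
# "Permission denied"; after `chmod o+r` it returns True.
```

Running `world_readable('/etc/resolv.conf')` before and after the chmod would have pinpointed the problem in seconds.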

Thank you all.

31 January 2019

A mutter and gnome-shell update

Some personal highlights:

Emoji OSK

The reworked OSK was featured a couple of cycles ago, but a notable thing that was still missing from the design reference was emoji input.

No longer; it’s sitting in a branch as of now:

This UI feeds from the same emoji list as GtkEmojiChooser and applies the same categorization/grouping; all the additional variants of an emoji are available in a popover. There’s also a (less catchy) keypad UI in place, ultimately hooked up to applications through GtkInputPurpose.

I do expect this to be in place for 3.32 for the Wayland session.

X11 vs Wayland

Ever since the Wayland work started on mutter, there have been ideas and talks about how mutter “core” should become detached from X11 code. It has been a long and slow process; every design decision has been directed towards this goal, we leaped forward in the 2017 GSoC, and e.g. Georges sums up some of his own recent work in this area.

For me it started with a “Hey, I think we are not that far off” comment in #gnome-shell earlier this cycle. Famous last words. After rewriting several (many!) seemingly unrelated subsystems and shuffling things here and there, we are at a point where gnome-shell might run with --no-x11 set. One more little push and we will be able to launch mutter as a pure Wayland compositor that just spawns Xwayland on demand.

What’s after that? It’s certainly an important milestone, but by no means are we done here. Also, gnome-settings-daemon consists for the most part of X11 clients, which spoils the fun by requiring Xwayland very early in a real session. Guess what’s next!

At the moment about 80% of the patches have been merged. I cannot promise at this point that they will all be in place for 3.32, but 3.34 most surely. But here’s a small yet extreme proof of work:

Performance

It’s been nice to see some of the performance improvements I did last cycle finally being merged. Some notable ones, like the one that stopped triggering full surface redraws on every surface invalidation. I also managed to get some blocking operations out of the main loop, which should fix many of the seemingly random stalls some people were seeing.

Those are already in 3.31.x, with many other nice fixes in this area from Georges, Daniel Van Vugt et al.

FOSDEM

As a minor note, I will be attending FOSDEM and the GTK+ Hackfest happening right after. Feel free to say hi, or find Wally, whatever comes first.

29 January 2019

Working on the Chromium Servicification Project

Igalia & Chromium

It’s been a few months already since I (re)joined Igalia as part of its Chromium team, and I couldn’t be happier about it: right from the very first day I felt perfectly integrated into the team, and I quickly started making my way through the (fully upstream) project that would keep me busy during the following months: the Chromium Servicification Project.

But what is this “Chromium Servicification Project”? Well, according to Wiktionary, the word “servicification” means, applied to computing, “the migration from monolithic legacy applications to service-based components and solutions”, which is exactly what this project is about: as described on the Chromium servicification project’s website, the whole purpose behind this idea is “to migrate the code base to a more modular, service-oriented architecture”, in order to “produce reusable and decoupled components while also reducing duplication”.

Doing so will not only make Chromium a more manageable project from a source code point of view and create better, more stable interfaces to embed Chromium from different projects; it should also enable teams to experiment with new features by combining these services in different ways, and to ship different products based on Chromium without having to bundle the whole world just to provide a particular set of features.

For instance, as Camille Lamy put it in the talk delivered (slides here) during the latest Web Engines Hackfest, “it might be interesting long term that the user only downloads the bits of the app they need so, for instance, if you have a very low-end phone, support for VR is probably not very useful for you”. This is of course not the current status of things yet (right now everything is bundled into a big executable), but it’s still a good way to visualise where this idea of moving to a services-oriented architecture should take us in the long run.

Chromium Servicification Layers

With this in mind, the idea behind this project is to work on migrating the different parts of Chromium that depend on the components being converted into services, which will become part of a “foundation” base layer providing the core services that any application, framework or runtime built on top of Chromium would need.

As you can imagine, the whole idea of refactoring such an enormous code base as Chromium’s is daunting and a lot of work, especially considering that currently ongoing efforts can’t simply be stopped just to perform this migration. That is where our focus currently lies: we integrate with the different teams in the Chromium project working on migrating those components into services, and we make sure that the clients of their old APIs move to the new services’ APIs, while keeping everything running normally in the meantime.

At the beginning, we started working on the migration to the Network Service (which allows running Chromium’s network stack even without a browser) and managed to get it shipped in Chromium Beta by early October, which was a pretty big deal as far as I understand. In my particular case that stage was a very short ride, since the migration was nearly done by the time I joined Igalia, but it is still worth mentioning for the impact it had on the project, for extra context.

After that, our team started working on the migration of the Identity service, where the main idea is to encapsulate the functionality of accessing the user’s identities behind this service, so that one day this logic can run outside of the browser process. One interesting bit about this migration is that this particular functionality (largely implemented inside the sign-in component) has historically lived quite high up in the stack, yet it is now being pushed all the way down into that “foundation” base layer, as a core service. That is probably one of the factors making this migration quite complicated, but everyone involved is very dedicated and has been very helpful so far, so I’m confident we’ll get there in a reasonable time frame.

If you’re curious enough, though, you can check this status report for the Identity service, where you can see the evolution of this particular migration, along with the impact our team has had since we started working on this part, back in early October. There are more reports and more information in the mailing list for the Identity service, so feel free to check it out and/or subscribe there if you like.

One clarification is needed, though: for now, the scope of these migrations is focused on using the public C++ APIs that such services expose (see //services/<service_name>/public/cpp), but in the long run the idea is that those services will also provide Mojo interfaces. That will enable using their functionality regardless of whether those services run as part of the browser process or in their own separate processes, which will then allow the flexibility that Chromium needs to run smoothly and safely in different kinds of environments, from the least constrained ones to those with a less favourable set of resources at their disposal.

And this is it for now, I think. I was really looking forward to writing a status update about what I’ve been up to in the past months, and here it is, even though it’s not the shortest of reports.

FOSDEM 2019

One last thing, though: as usual, I’m going to FOSDEM this year as well, along with a bunch of colleagues & friends from Igalia, so please feel free to drop me/us a line if you want to chat and/or hang out, either to talk about work-related matters or anything else really.

And, of course, I’d also be more than happy to talk about any of the open job positions at Igalia, should you consider applying. There are quite a few of them available at the moment for all kinds of things (most of them open to remote work): from more technical roles such as graphics, compilers, multimedia, JavaScript engines, browsers (WebKit, Chromium, Web Platform) or systems administration (this one not available for remote work, though), to other less “hands-on” roles like developer advocate, sales engineer or project manager, so there may well be something interesting for you if you’re considering joining such a special company.

See you in FOSDEM!

23 January 2019

WORA-WNLF


I started my career writing web applications. I had struggles with PHP web frameworks, JavaScript libraries, and rendering differences (CSS and non-CSS glitches) across browsers. After leaving that world, I started focusing more on the backend side of things, fleeing from the frontend camp (mainly, actually, just scared of that abomination that was JavaScript; because in my spare time I still did things with frontends: I hacked on a GTK media player called Banshee and a GTK chat app called Smuxi).

So there you had me: a backend dev by day, desktop dev by night. But in the GTK world I had struggles similar to the ones I had had as a frontend dev when the browsers wouldn’t behave in the same way. I’m talking about GTK bugs on other, non-Linux OSs, i.e. Mac and Windows.

See, I wanted to bring a desktop app to the masses, but these problems (and others of different kinds) prevented me from doing it. And while all this was happening, another major shift was happening as well: desktop environments were fading while mobile (and not-so-mobile: tablets!) platforms were rising in usage. This meant yet more platforms that I wished GTK supported. As I’m not a C language expert (nor do I want to be), I kept googling for the terms “gtk” and “android”, or “gtk” and “iOS”, to see if some hacker had put something together that I could use. But that day never came.

Plus, I started noticing a trend: big companies with important mobile apps were moving away from HTML5 within their apps in favour of native apps, mainly chasing the native look & feel. This clearly meant that even if someone cooked up a hack that made GTK+ run on Android, it would still feel foreign, and nobody would dare use it.

So I started to become a fan of abstraction layers that act as a common denominator of different native toolkits while keeping their native look & feel. For example, XWT, the widget toolkit that Mono uses in MonoDevelop to target all three toolkits depending on the platform: Cocoa (on macOS), GTK (on Linux) and WPF (on Windows). A pretty cool hack, if you ask me. But using it would contradict my desire for a toolkit that already supported Android!

And there it was: Xamarin.Forms, an abstraction layer over iOS, Android and Windows Phone, which however didn’t support desktops. Plus, at the time Xamarin was proprietary (and I didn’t want to leave my open source world). It was a big dilemma.

But then, some years passed, and many events happened around Xamarin.Forms:
  • Xamarin (the company) was bought by Microsoft and, at the same time, Xamarin (the product) was open sourced.
  • Xamarin.Forms is open source now (TBH, not sure if it was proprietary before or always open source).
  • Xamarin.Forms started supporting macOS and Windows UWP.
  • Xamarin.Forms 3.0 included support for GTK and WPF.

So that was the last straw that made me switch all my desktop efforts to Xamarin.Forms. Not only can I still target Linux+GTK (my favorite platform), I can also make my apps run on mobile platforms and on the desktop OSs most people use. So both my niche and the mainstream are covered! But this is not the end: Xamarin.Forms has recently been ported to Tizen too! (A Linux-based OS used by Samsung in smart TVs and watches.)

Now let me ask you something. Do you know of any graphical toolkit that allows you to target six different platforms with the same codebase? I repeat: Linux (GTK), Windows (UWP/WPF), macOS, iOS, Android, Tizen. The old Java saying finally applies (but on the frontend side): “write once, run anywhere” (WORA), to which I add “with native look’n’feel” (WORA-WNLF).

If you want to know the hero who made the GTK backend of Xamarin.Forms, follow @jsuarezruiz, who BTW has recently been hired by Microsoft to work on their non-Windows IDE ;-)

PS: If you like .NET and GTK, my employer is also hiring! (Remote positions might be available too.) Ping me.

11 January 2019

«de Par en Par»: towards a national community gathering on technology and the commons

First, a brief disclaimer: a few years ago, as I was getting back to an active life, I decided to write more, but only when I could contribute substantive content. The scarcity of my writing from 2018 until today, however, has been due to lack of time: a great deal of work done, part of which would have deserved more echo in this medium. In this post I only want to share some thoughts about a national event that, fortunately, is taking shape as I write these lines. I would love for it to end up being called «de Par en Par».

Rebooting the HispaLinux community

A quick bit of background: in a fit of nostalgia and enthusiasm, and quite spontaneously after a one-off meeting in late January, @SorayaMuoz and @Juantomas took it upon themselves to celebrate the 20th anniversary of the founding of the HispaLinux association (somewhat late, since it has just turned 21) and to call together the old friends and comrades-in-arms from that era. It is sad to admit, but one already talks about memories like the old man you never imagined you would become.

HispaLinux association registration form

The important thing is that they called up some friends, created a Telegram group and raided their address books to bring in the whole crowd from back then. In a couple of days we were already over 100 members and climbing. And we are still missing people, #OjO. Some kind of gathering was proposed immediately, and proposals sprang up like goats on a hillside. One of them, the one of interest in this article, would be a future, and desirable, technology gathering, heir to the much-missed HispaLinux congresses. For now two proposals are taking shape and we will know more soon.

Naming the gathering

With an eye on future annual editions, I would like to propose a new name: «de Par en Par». Why? In many ways we are recovering the spirit, community and values of the old HispaLinux congresses, which back then served as a catalyst for an eager and restless community and preceded the current abundant scene of technology meetups and conferences all over Spain. I myself was a promoter, collaborator or organizer of those gatherings.

It might therefore seem very fitting today to bring that name back. Personally, I believe it is no longer suitable or advisable:

  • the HispaLinux name is burned out: the last active period of the association the congress took its name from was marked by a decline plain for everyone to see, something normal and even understandable within the dynamics of associations;

  • however, the last (final?) board that took over precipitated the association into absolute irrelevance, even shutting down services available to members and, worst of all, cutting off communication with the bulk of the membership and, by extension, ending any democratic representation, with no publicly known general assemblies or other relevant actions; in my opinion the most sensible thing is to stay away from those people, and deep down I would only wish to punish them with the whip of indifference;

  • in my opinion the «Linux» brand no longer has the strength and impact it had, especially, in the first decade of the 2000s; instead it has become a commodity: it can no longer look as sectarian to some as it did in the past, and it is so widely adopted in industry that these technologies are practically taken for granted in most ICT fields; this is nothing short of wonderful, but as a brand or name I no longer see in it the disruptive pull of the past;

  • besides, the evolution of FLOSS communities now reaches far beyond Linux systems, operating systems, the GNU community, and so on: not only are there vast amounts of free software products that run on other systems (Android, Windows, iPhone…) or are fostered by communities with no strict relation to Linux (some of them immense, such as Apache or Eclipse), plus others booming and cross-cutting, built around programming languages and frameworks; the movement also goes beyond software into free and open content, from Wikipedia, Creative Commons, OpenStreetMap or 3D models… to the ever more abundant sources of open data;

  • and within HispaLinux’s own activity there had already been a very important shift of focus: towards the protection of digital rights and the legal frameworks of the digital society, both to build a common heritage of software and to feed intangible commons such as innovation (for example, the fight against software patents), ICT security, and personal privacy and anonymity on the Internet, etc.;

  • finally, the HispaLinux name is widely recognized and even cherished by those of us who lived those times most intensely… and we are no longer the youngest; but does it attract anyone else? Without giving up on that audience of ours, I propose opening up to today’s whole audience, larger, better prepared and more diverse than ever.

Why «de Par en Par»?

  • because it is a gathering of the community, by the community and for the community;

  • because the best meritocracy of hackerdom is based on equality, and that is how we relate to each other and how we collaborate: as peers;

  • and because «de par en par» means wide open: open to the intellectual property and reuse frameworks that we believe are fair and indispensable for today’s digital society, open to all the digital and intellectual works created within those frameworks, and, as a community, open to newcomers: we don’t want co-optation; you are one of us because you want to be.

We are equals. Cross-cutting. Everything is open. We live wide open («de par en par»).

de Par en Par

08 January 2019

Epiphany automation mode

Last week I finally found some time to add an automation mode to Epiphany, which allows running automated tests using WebDriver. It’s important to note that the automation mode is not meant for users or applications to control the browser remotely, but only for WebDriver automated tests. For that reason, the automation mode is incompatible with a primary user profile. A few other things are affected by the automation mode:

  • There’s no persistence. A private profile is created in tmp and only ephemeral web contexts are used.
  • URL entry is not editable, since users are not expected to interact with the browser.
  • An info bar is shown to notify the user that the browser is being controlled by automation.
  • The window decoration is orange to make it even clearer that the browser is running in automation mode.

So, how can I write tests to be run in Epiphany? First, you need to install a recent enough Selenium. For now, only the Python API is supported. Selenium doesn’t have an Epiphany driver, but the WebKitGTK driver can be used with any WebKitGTK+-based browser by providing the browser information as part of the session capabilities.

from selenium import webdriver

options = webdriver.WebKitGTKOptions()
options.binary_location = 'epiphany'
options.add_argument('--automation-mode')
options.set_capability('browserName', 'Epiphany')
options.set_capability('version', '3.31.4')

ephy = webdriver.WebKitGTK(options=options, desired_capabilities={})
ephy.get('http://www.webkitgtk.org')
ephy.quit()

This is a very simple example that just opens Epiphany in automation mode, loads http://www.webkitgtk.org and closes Epiphany. A few comments about the example:

  • Version 3.31.4 will be the first one including the automation mode.
  • The parameter desired_capabilities shouldn’t be needed, but there’s a bug in selenium that has been fixed very recently.
  • WebKitGTKOptions.set_capability was added in Selenium 3.14; if you have an older version you can use the following snippet instead:
from selenium import webdriver

options = webdriver.WebKitGTKOptions()
options.binary_location = 'epiphany'
options.add_argument('--automation-mode')
capabilities = options.to_capabilities()
capabilities['browserName'] = 'Epiphany'
capabilities['version'] = '3.31.4'

ephy = webdriver.WebKitGTK(desired_capabilities=capabilities)
ephy.get('http://www.webkitgtk.org')
ephy.quit()

To simplify the driver instantiation, you can create your own Epiphany driver derived from the WebKitGTK one:

from selenium import webdriver

class Epiphany(webdriver.WebKitGTK):
    def __init__(self):
        options = webdriver.WebKitGTKOptions()
        options.binary_location = 'epiphany'
        options.add_argument('--automation-mode')
        options.set_capability('browserName', 'Epiphany')
        options.set_capability('version', '3.31.4')

        webdriver.WebKitGTK.__init__(self, options=options, desired_capabilities={})

ephy = Epiphany()
ephy.get('http://www.webkitgtk.org')
ephy.quit()

The same for Selenium < 3.14:

from selenium import webdriver

class Epiphany(webdriver.WebKitGTK):
    def __init__(self):
        options = webdriver.WebKitGTKOptions()
        options.binary_location = 'epiphany'
        options.add_argument('--automation-mode')
        capabilities = options.to_capabilities()
        capabilities['browserName'] = 'Epiphany'
        capabilities['version'] = '3.31.4'

        webdriver.WebKitGTK.__init__(self, desired_capabilities=capabilities)

ephy = Epiphany()
ephy.get('http://www.webkitgtk.org')
ephy.quit()

31 December 2018

21 December 2018

Importing JSON into MySQL using MySQL Shell

The MySQL Shell utility lets us import a JSON file into a MySQL table or collection.

First we must enable the X Protocol:

$ mysqlsh -u root -h localhost --mysql --dba enableXProtocol
Please provide the password for 'root@localhost':
Save password for 'root@localhost'? [Y]es/[N]o/Ne[v]er (default No):
enableXProtocol: Installing plugin mysqlx…
enableXProtocol: done

Now we can connect to the MySQL server using MySQL Shell (and the X Protocol):

$ mysqlsh -u root -h localhost --mysqlx

I have an empty database called addi, and I want to import the file result.json there, into a collection named addi_collection.

The command to run would be:

MySQL Shell > util.importJson("result.json", {schema: "addi", collection: "addi_collection"});
Importing from file "result.json" to collection `addi`.`addi_collection` in MySQL Server at localhost:33060

The problem I ran into is that my JSON file didn’t have a unique _id field in each record (see the previous post on ikasten.io), so I had to create one. This wouldn’t be a problem with MySQL Server 8.0 or later, but I’m using an oldish server (5.7.19), so I got this error:

Processed 182.22 KB in 80 documents in 0.0340 sec (2.35K documents/s)
Total successfully imported documents 0 (0.00 documents/s)
Document is missing a required field (MySQL Error 5115)

After adding the _id field to every record, I could import without problems:

util.importJson("result.json", {schema: "addi", collection: "addi_collection"});
Importing from file "result.json" to collection `addi`.`addi_collection` in MySQL Server at localhost:33060
.. 80.. 80
 Processed 182.93 KB in 80 documents in 0.0379 sec (2.11K documents/s)
 Total successfully imported documents 80 (2.11K documents/s)

More info on the JSON import utility in MySQL Shell.

The result of the import is stored in a collection reminiscent of MongoDB collections.
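The manual fix can also be scripted; here is a minimal Python sketch (with hypothetical file names) that prepends an incremental _id to each record of a newline-delimited JSON file like result.json:

```python
import json

def add_ids(src_path, dst_path):
    """Copy an NDJSON file, adding an incremental _id to each record."""
    with open(src_path) as src, open(dst_path, "w") as dst:
        for i, line in enumerate(src, start=1):
            if not line.strip():
                continue  # skip blank lines
            record = json.loads(line)
            # Rebuild the dict so _id comes first (cosmetic only;
            # what matters to MySQL is that the field exists).
            dst.write(json.dumps({"_id": i, **record}) + "\n")

# Usage (hypothetical output name):
# add_ids("result.json", "result_with_ids.json")
```

The resulting file can then be fed to util.importJson() as before.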

20 December 2018

Search and replace with incremental values in Vim

Suppose we have a JSON file like the following:

{ "clave1" : "valor11", "clave2": "valor12", … }
{ "clave1" : "valor21", "clave2": "valor22", … }
…
{ "clave1" : "valorN1", "clave2": "valorN2", … }

and we want to add a new field at the beginning, with an incremental _id, so that it ends up like this:

{ "_id" : 1, "clave1" : "valor11", "clave2": "valor12", … }
{ "_id" : 2, "clave1" : "valor21", "clave2": "valor22", … }
…
{ "_id" : n, "clave1" : "valorN1", "clave2": "valorN2", … }

In Vim we can do it by defining a function:

:let g:incr = 0
:function Incr()
:let g:incr = g:incr + 1
:return g:incr
:endfu

Once the Incr() function is defined, we can invoke it in a find & replace command using the \= operator, which evaluates expressions, to perform the substitution we want.

That is, following the general form :%s/search_pattern/replacement/gc :

:%s/^{/\="{\"_id\":" . Incr() . ","/gc

Search pattern: ^{ (lines starting with {).
Replacement: \="{\"_id\":" . Incr() . "," (that is, evaluate the expression "{\"_id\":" . Incr() . ",", which initially produces {"_id":1, ).
/gc: global changes (across the whole document, not just the first match), with confirmation (you can press the "a" (all) key once you see that the changes are correct after the first few substitutions).
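The same trick translates to any language whose replace function accepts a callback, which is essentially what Vim’s \= operator provides; a minimal Python sketch of the equivalent using re.sub:

```python
import re
from itertools import count

def number_records(text):
    """Replace each leading '{' with '{"_id":N,' for an incremental N."""
    counter = count(1)  # 1, 2, 3, …
    # re.M makes ^ match at the start of every line, like Vim's ^
    return re.sub(r"^\{", lambda m: '{"_id":%d,' % next(counter),
                  text, flags=re.M)

doc = '{ "clave1" : "valor11" }\n{ "clave1" : "valor21" }\n'
print(number_records(doc))
# {"_id":1, "clave1" : "valor11" }
# {"_id":2, "clave1" : "valor21" }
```

The lambda plays the role of Incr(): it is called once per match, so each substitution sees a fresh counter value.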

If you want more info about functions and the VimScript language, take a look at this tutorial.

15 December 2018

Disabling Command+C in VirtualBox for macOS

A quick tip about something that had intrigued me for a while. If you use VirtualBox on macOS, surely, with a virtual machine running, you have accidentally pressed Command+C (⌘+C) to copy text (the default shortcut on macOS) instead of Ctrl+C (the default on Linux and Windows). The problem is that in VirtualBox the Command+C combination scales the screen size (and makes it tiny!). To disable this annoying behavior, just open the VirtualBox preferences (press ⌘+,), go to the Input tab, then the Virtual Machine tab, click on Scaled Mode and delete the wretched shortcut.

Goodbye, ⌘+C!

25 November 2018

Frogr 1.5 released

It’s almost one year later and, despite the acquisition by SmugMug a few months ago and the predictions from some people that it would mean me stopping using Flickr and maintaining frogr, here comes the new release of frogr 1.5.

Frogr 1.5 screenshot

Not many changes this time, but some of them hopefully still useful for some people, such as the empty initial state that is now shown when you don’t have any pictures, as requested a while ago already by Nick Richards (thanks Nick!), or the removal of the applications menu from the shell’s top panel (now integrated in the hamburger menu), in line with the “App Menu Retirement” initiative.

Then there were some fixes here and there as usual, and quite so many updates to the translations this time, including a brand new translation to Icelandic! (thanks Sveinn).

So this is it this time, I’m afraid. Sorry there’s not much to report, and sorry as well for how long this release took me, but this past year has been pretty busy: hectic work at Endless during the first part of the year, a full international relocation with my family to move back to Spain during the summer, and then getting back to work at Igalia as part of the Chromium team, where I’m currently pretty busy working on the Chromium Servicification project (which is material for a completely different blog post, of course).

Anyway, last but not least, feel free to grab frogr from the usual places as outlined on its main website, among which I’d recommend the Flatpak method, either via GNOME Software or from the command line by just doing this:

flatpak install --from \
    https://flathub.org/repo/appstream/org.gnome.frogr.flatpakref

For more information just check the main website, which I also updated to this latest release, and don’t hesitate to reach out if you have any questions or comments.

Hope you enjoy it. Thanks!

15 de November de 2018

On the track for 3.32

It happens sneakily, but there is more going on on the Tracker front than the occasional fallout. Yesterday 2.2.0-alpha1 was released, containing some notable changes.

On and off during the last year, I’ve been working on a massive rework of the SPARQL parser. The old parser was fairly solid, but hard to extend to some of the syntax in the SPARQL 1.1 spec. After multiple attempts and failures at implementing property paths, I convinced myself this was the way forward.

The main difference is that the previous parser was more of a serializer to SQL; only minimal state was preserved across the operation. The new parser does construct an expression tree, so that nodes may be shuffled and reevaluated. This allows some sweet things:

  • Property paths are a nice way to write more idiomatic SPARQL, and most property path operators are within reach now. There’s currently support for sequence paths:

    # Get all files in my homedir
    SELECT ?elem {
      ?elem nfo:belongsToContainer/nie:url 'file:///home/carlos'
    }
    


    And inverse paths:

    # Get all files in my homedir by inverting
    # the child to container relation
    SELECT ?elem {
      ?homedir nie:url 'file:///home/carlos' ;
               ^nfo:belongsToContainer ?elem
    }
    

    There are harder ones, like + and *, that will require recursive selects, and the negation operator (!) is not possible to implement yet.

  • We now have prepared statements! A TrackerSparqlStatement object was introduced, capable of holding a query with parameters which can be set/replaced prior to execution.

    conn = tracker_sparql_connection_get (NULL, NULL);
    stmt = tracker_sparql_connection_query_statement (conn,
                                                      "SELECT ?u { ?u fts:match ~term }",
                                                      NULL, NULL);
    
    tracker_sparql_statement_bind_string (stmt, "term", search_term);
    cursor = tracker_sparql_statement_execute (stmt, NULL, NULL);
    

    This is a long-sought protection against injections. The object is cacheable and can serve multiple cursors asynchronously, so it will also be an improvement for frequent queries.

  • More concise SQL is generated in places, which brings slight improvements to SQLite query planning.
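
For reference, the harder one-or-more (+) operator mentioned above would, per SPARQL 1.1 property-path syntax, look something like this. A hypothetical sketch (with a made-up file URL), since this operator is not yet supported here:

```sparql
# Hypothetical: walk nfo:belongsToContainer one or more times,
# yielding every ancestor container of the given file
SELECT ?dir {
  ?file nie:url 'file:///home/carlos/notes.txt' ;
        nfo:belongsToContainer+ ?dir
}
```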

This also got the ideas churning towards future plans, the trend being a generic triple store as SPARQL 1.1-capable as possible. There are also some ideas about better data isolation for Flatpak and sandboxes in general (seeing that the currently supported approach didn’t catch on). Those will eventually happen in this or the following cycles, but I’ll reserve that for another blog post.

An eye was kept on memory usage too (mostly unrealized ideas from the performance hackfest earlier this year): tracker-store has been made to shut down automatically when unneeded (ideally most of the time, since it just takes care of updates and the unruly apps that use the bus connection), and tracker-miner-fs took over the functionality of tracker-miner-apps. That’s 2 processes fewer in your default session.

In general, we’re on the way to an exciting release, and there’s more to come!

13 de November de 2018

Degree final work about ISO/IEC 29110

Cover of «Creation of artifacts for adoption of ISO/IEC 29110 standards» blueprint

I would love to write more in this blog. There are matters I haven’t talked enough about, like the SuperSEC and GUADEC conferences, some announcements for 2019, and some activities in Wikipedia (especially in Wikiproyecto-Almería and my first steps in the amazing world of SPARQL), less important things but ones I really enjoy.

But now I want to keep a record of significant advances in the university degree I’m finishing these months. I decided to finish a pending course, with special interest in the required degree final work, so I could work on things I’ve been interested in since 2003 but never had the opportunity to focus on deeply enough to study, learn and write some (I hope) useful tools. And it’s being fun :-)

29110 Galore at http://29110.olea.org

So now I can say that the project blueprint has been approved by the university. It’s named «Creation of artifacts for adoption of ISO/IEC 29110 standards» (document in Spanish, sorry) and the goal is to produce a set of opensource artifacts for the adoption of the 29110 family of standards, which is focused on a light software engineering methodology suitable for adoption by very small entities (VSEs). At the moment my main target is to work on «Part 5-4: Agile software development guidelines», currently under development by WG24, using the EPF Composer tool.

As a working tool I’m making a (half-baked and maybe temporary) website to keep a record of related materials at http://29110.olea.org.

Hope to announce related news in the next weeks.

01 de November de 2018

Running EPF Composer in Fedora Linux, v3

Well, I finally succeeded with a native installation of the EPF (Eclipse Process Framework) Composer on my Linux system, thanks to the help of Bruce MacIsaac and the development team. I’m happy. This is not trivial, since EPFC is a 32-bit application running on a modern 64-bit Linux system.

My working configuration:

  • Fedora F28, x86_64
  • java-1.8.0-oracle-1.8.0.181, 32 bits, from the non-free Russian Fedora repository:
    • java-1.8.0-oracle-1.8.0.181-3.fc28.i586.rpm
    • java-1.8.0-oracle-headless-1.8.0.181-3.fc28.i586.rpm
  • EPF Composer Linux/GTK 1.5.2
  • GTK+ v.2 integration dependencies (from main Fedora repository):
    • adwaita-gtk2-theme-3.28-1.fc28.i686.rpm
    • libcanberra-gtk2-0.30-16.fc28.i686.rpm
  • xulrunner 32 bits xulrunner-10.0.2.en-US.linux-i686.tar.bz2
  • libXt-1.1.5-7.fc28.i686.rpm (from main Fedora repository).

On my system I can obviously install all the RPM packages using DNF. For other distros, look for the equivalent packages.

Maybe I’m missing some minor dependency; I didn’t check on a clean installation.

Download EPFC and xulrunner and extract each one into the path of your choice. I’m using xulrunner-10.0.2.en-US.linux-i686/ as the directory name to be more meaningful.

The contents of epf.ini file:

-data
@user.home/EPF/workspace.152
-vmargs
-Xms64m
-Xmx512m
-Dorg.eclipse.swt.browser.XULRunnerPath=/PATHTOXULRUNNER/xulrunner-10.0.2.en-US.linux-i686/

I had to write the full system path in the -Dorg.eclipse.swt.browser.XULRunnerPath property to get Eclipse to recognize it.

And to run EPF Composer:

cd $EPF_APP_DIR
./epf -vm /usr/lib/jvm/java-1.8.0-oracle-1.8.0.181/jre/bin/java

If you want to do some non-trivial work with Composer on Linux you’ll need xulrunner, since it’s used extensively for editing contents.

Native Linux EPF Composer screenshot

I had success running the Windows EPF version using Wine and I can do some work with it, but at some point the program gets unstable and needs to be restarted. Another very interesting advantage of running natively is that I can use the GTK+ file chooser, which is a lot better than the simpler native Java one.

I plan to practice a lot modeling with EPF Composer in the coming weeks. Hopefully I’ll share some new artifacts authored by me.

PD: added the required libXt dependency.

25 de October de 2018

3 events in a month

As part of my job at Igalia, I have been attending 2-3 events per year. My role, mostly as a Chromium stack engineer, is not usually very demanding regarding conference trips, but they are quite important as an opportunity to meet collaborators and project mates.

This month has been a bit different, as I ended up visiting Santa Clara LG Silicon Valley Lab in California, Igalia headquarters in A Coruña, and Dresden. It was mostly because I got involved in the discussions for the web runtime implementation being developed by Igalia for AGL.

AGL f2f at LGSVL

It is always great to visit LG Silicon Valley Lab (Santa Clara, US), where my team is located. I have been participating for 6 years in the development of the webOS web stack you can most prominently enjoy in LG webOS smart TV.

One of the goals for the next months at AGL is providing an efficient web runtime. At LGSVL we have been developing and maintaining WAM, the webOS web runtime. And as it was released under an open source license in webOS Open Source Edition, it looked like a great match for AGL. So my team did a proof of concept in May and it was successful. At the same time, Igalia has been working on porting the Chromium browser to AGL. So, after some discussions, AGL approved sponsoring my company, Igalia, to port the LG webOS web runtime to AGL.

As LGSVL was hosting the September 2018 AGL f2f meeting, Igalia sponsored my trip to the event.

AGL f2f Santa Clara 2018, AGL wiki CC BY 4.0

So we took the opportunity to continue discussions and make progress in the development of the WAM AGL port. And, as we expected, it was quite beneficial for unblocking tasks like the AGL app framework security integration, and support for the latest official AGL release, Funky Flounder. Julie Kim from Igalia attended the event too, and presented an update on the progress of the Ozone Wayland port.

The organization and the venue were great. Thanks to LGSVL!

Web Engines Hackfest 2018 at Igalia

The next trip was definitely closer: just a 90-minute drive to our Igalia headquarters in A Coruña.


Igalia has been organizing this event since 2009. It is a cross-web-engine event, where engineers of Mozilla, Chromium and WebKit have been meeting yearly to do some hacking, and discuss the future of the web.

This time my main interest was participating in the discussions about the effort by Igalia and Google to support Wayland natively in Chromium. I was pleased to learn that around 90% of the work had already landed in upstream Chromium. Great news, as it will smooth the integration of Chromium for embedders using Ozone Wayland, like webOS. It was also great to learn about the work to improve GPU performance by reducing the number of copies required for painting web contents.

Web Engines Hackfest 2018 CC BY-SA 2.0

Other topics of my interest:
– We did a follow-up of the discussion from the last BlinkOn about the barriers for Chromium embedders, sharing our experiences maintaining a downstream Chromium tree.
– I joined the discussions about the future of WebKitGTK, in particular the adaptation of the graphics pipeline to the upcoming GTK+ 4.

As usual, the organization was great. We had 70 people at the event, and it was awesome to see all the activity in the office, and so many talented engineers in the same place. Thanks Igalia!

Web Engines Hackfest 2018 CC BY-SA 2.0

AGL All Members Meeting Europe 2018 at Dresden

The last event in barely a month was my first visit to the beautiful town of Dresden (Germany).

The goal was continuing the discussions on the projects Igalia is developing for the AGL platform: Chromium upstream native Wayland support, and the WAM web runtime port. We also had a booth showcasing that work, along with our lightweight WebKit port WPE, which was, as usual, attracting interest with its 60fps video playback performance on a Raspberry Pi 2.

I co-presented a talk with Steve Lemke about the automotive activities at LGSVL, taking the opportunity to give an update on the status of the WAM web runtime work for AGL (slides here). The project is progressing and Igalia should soon land the first results of the work.

Igalia booth at AGL AMM Europe 2018

It was great to meet all these people, and to discuss in person the architecture proposal for the web runtime, unblocking several tasks and producing a more detailed plan for the next months.

Dresden was great, and I can’t help highlighting the reception and guided tour in the Dresden Transportation Museum. Great choice by the organization. Thanks to Linux Foundation and the AGL project community!

Next: Chrome Dev Summit 2018

So… what’s next? I will be visiting San Francisco in November for Chrome Dev Summit.

I can only thank Igalia for sponsoring my attendance at these events. They are quite important for keeping things moving forward, but it is also really nice to meet friends and collaborators. Thanks Igalia!

18 de October de 2018

How to cite bibliography ISO/IEC standards

For my postgraduate final work I’m collecting bibliography, and as the main work is around ISO/IEC documents I investigated how to make a correct bibliography entry for these, which I realized is not very well known, as you can check in this question on TeX.StackExchange.com.

I finally chose a style, which I show here as an example:

  • BibTeX:
    @techreport{iso_central_secretary_systems_2016,
      address = {Geneva, CH},
      type = {Standard},
      title = {Systems and software engineering -- {Lifecycle} profiles for {Very} {Small} {Entities} ({VSEs}) -- {Part} 1: {Overview}},
      shorttitle = {{ISO}/{IEC} {TR} 29110-1:2016},
      url = {https://www.iso.org/standard/62711.html},
      language = {en},
      number = {ISO/IEC TR 29110-1:2016},
      institution = {International Organization for Standardization},
      author = {{ISO Central Secretary}},
      year = {2016}
    }
    
  • RIS:
      TY  - RPRT
      TI  - Systems and software engineering -- Lifecycle profiles for Very Small Entities (VSEs) -- Part 1: Overview
      AU  - ISO Central Secretary
      T2  - ISO/IEC 29110
      CY  - Geneva, CH
      PY  - 2016
      LA  - en
      M3  - Standard
      PB  - International Organization for Standardization
      SN  - ISO/IEC TR 29110-1:2016
      ST  - ISO/IEC TR 29110-1:2016
      UR  - https://www.iso.org/standard/62711.html
      ER  - 
    

I’ve been using this style extensively on a development website, http://29110.olea.org/. You can compare the details with the official info.

Both have been generated using Zotero.
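
In a LaTeX document the BibTeX entry above is then cited in the usual way. A minimal sketch, assuming the entry is saved in a file named standards.bib (a filename chosen here just for illustration):

```latex
\documentclass{article}
\begin{document}

ISO/IEC~29110 targets very small entities
(VSEs)~\cite{iso_central_secretary_systems_2016}.

\bibliographystyle{plain}
\bibliography{standards} % standards.bib holds the @techreport entry above
\end{document}
```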

07 de October de 2018

Banksy Shredder




PD: After some reports about male nudity this post has been edited to remove the portrait of my back. If you have reservations with male nudity PLEASE DON'T FOLLOW THE LINK.

PPD: If you don't have problems with male nudity for your convenience here you'll find the Wiki Commons category «Nude men» of pictures.

«Software Quality Assurance, First Edition» PDF file

Print ISBN:9781118501825, Online ISBN:9781119312451, DOI:10.1002/9781119312451

For your convenience I’ve compiled into a single file the book Software Quality Assurance by Claude Y. Laporte and Alain April. The book is provided for free download on the publisher’s website as separate files. Download the full book.

About the book: «This book introduces Software Quality Assurance (SQA) and provides an overview of standards used to implement SQA. It defines ways to assess the effectiveness of how one approaches software quality across key industry sectors such as telecommunications, transport, defense, and aerospace.»

It is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

Claude Y. Laporte is the editor of the ISO/IEC 29110 standard of software engineering for very small entities (VSE).

PD: Added licensing details.

04 de October de 2018

GUADEC 2018 by numbers

GUADEC 2018 badge

It took me a while, but now I can give you some stats from GUADEC 2018, following Sam’s example from last year.

They are very rough but I hope informative.

  • Attendees: 207 (about 215 registered), two fewer than in 2017.
  • 9 days: 2 days of board meetings, 3 of conferences and 4 of BoFs and workshops.
  • 44 talks and videos.
  • 35 BoFs and workshops.
  • 3 great parties, including the flamenco show by «la Chinelita and group».
  • Regarding economics, suffice to say it was very successful. Thanks a lot to our sponsors and donors, and special kudos to the sponsoring team for such an impressive job.
  • The average age was 32.7, from the 143 people who provided their age. The minimum age was 15 and the maximum 61.
  • There were 43 Spanish attendees, 20.8% of the total, with a significant presence from the UK at 20%.
  • The maximum number of people per day hosted at the Civitas residence was 75. I would have expected a bigger number, but several factors played a part: the GUADEC dates were in high season, with a particular peak due to some local events (at some point there was not a single room available in Almería city), there was also a coincidence with a university summer course, and finally many people tried to book at Civitas very late.

The attendees who filled in their country of residence, by country:

country number
Argentina 1
Australia 1
Austria 2
Belgium 1
Brazil 2
Canada 3
China 3
Czech Republic 10
Denmark 2
Finland 1
France 9
Germany 11
Greece 1
India 2
Israel 1
Italy 3
Japan 1
Latvia 1
Netherlands 2
New Zealand 1
Norway 2
Romania 3
Russian Federation 2
Spain 43
Sri Lanka 1
Sweden 2
Switzerland 1
United Kingdom 41
United States 25
Unspecified 18
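
As a quick sanity check of the percentages quoted above, the country shares can be recomputed from this table (a small sketch using the 207 attendee total):

```python
# Recompute the country shares quoted in the post (total: 207 attendees).
TOTAL = 207
by_country = {"Spain": 43, "United Kingdom": 41, "United States": 25}

for country, count in by_country.items():
    share = 100 * count / TOTAL
    print(f"{country}: {share:.1f}%")
# Spain: 20.8%, United Kingdom: 19.8%, United States: 12.1%
```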


The GUADEC occupancy at Civitas was:

date               number
02/07/2018 1
03/07/2018 7
04/07/2018 21
05/07/2018 66
06/07/2018 70
07/07/2018 71
08/07/2018 75
09/07/2018 70
10/07/2018 63
11/07/2018 40
12/07/2018 5
13/07/2018 1


Thanks Benjamin Berg for helping to collect the data.

Thank you all for visiting us in Almería. Don’t forget to come back :-)

PD: post edited at 2018/11/15 adding details of residence occupancy.

02 de October de 2018

Wacom's graphic tablet sizes (2)

In a previous entry I posted the data I had collected about Wacom digitizer tablets. Collecting the data took me more time than I really wished. But now I’m happy to publish an exhaustive list, thanks to Carlos Garnacho:

model active area size mm active area size in
Wacom ISDv4 E2 356 ✕ 203 mm 14 ✕ 8 in
Wacom Intuos BT M 254 ✕ 203 mm 10 ✕ 8 in
Wacom ISDv4 104 229 ✕ 127 mm 9 ✕ 5 in
Wacom Intuos BT S 203 ✕ 152 mm 8 ✕ 6 in
Wacom Intuos M 254 ✕ 203 mm 10 ✕ 8 in
Wacom Intuos S 203 ✕ 152 mm 8 ✕ 6 in
Wacom ISDv4 5110 305 ✕ 178 mm 12 ✕ 7 in
Wacom Bamboo Pen medium 152 ✕ 102 mm 6 ✕ 4 in
Wacom Bamboo Fun medium (2+FG) 229 ✕ 127 mm 9 ✕ 5 in
Wacom DTU-2231 483 ✕ 279 mm 19 ✕ 11 in
Wacom Bamboo Pen small 152 ✕ 102 mm 6 ✕ 4 in
Wacom Cintiq 21UX2 432 ✕ 330 mm 17 ✕ 13 in
Wacom Graphire Wireless 203 ✕ 152 mm 8 ✕ 6 in
ELAN 2537 356 ✕ 203 mm 14 ✕ 8 in
Wacom ISDv4 5002 279 ✕ 152 mm 11 ✕ 6 in
Wacom ISDv4 5000 279 ✕ 152 mm 11 ✕ 6 in
Wacom ISDv4 485e 254 ✕ 178 mm 10 ✕ 7 in
Wacom Bamboo (2+FG) 127 ✕ 76 mm 5 ✕ 3 in
Wacom DTH1152 229 ✕ 127 mm 9 ✕ 5 in
Wacom Bamboo Fun small (2+FG) 152 ✕ 102 mm 6 ✕ 4 in
Wacom Bamboo Pen & Touch (2+FG) 152 ✕ 102 mm 6 ✕ 4 in
Wacom Cintiq 22HD touch 483 ✕ 279 mm 19 ✕ 11 in
Wacom DTI520UB/L 356 ✕ 305 mm 14 ✕ 12 in
Wacom Bamboo Fun medium (2FG) 229 ✕ 127 mm 9 ✕ 5 in
Wacom Bamboo Fun small (2FG) 152 ✕ 102 mm 6 ✕ 4 in
Wacom Bamboo (2FG) 152 ✕ 102 mm 6 ✕ 4 in
Wacom Bamboo Touch (2FG) 127 ✕ 76 mm 5 ✕ 3 in
Huion H610 Pro 254 ✕ 152 mm 10 ✕ 6 in
Bamboo One 127 ✕ 102 mm 5 ✕ 4 in
Wacom Bamboo Pen 152 ✕ 102 mm 6 ✕ 4 in
Wacom Intuos BT M 254 ✕ 203 mm 10 ✕ 8 in
Wacom ISDv4 12C 279 ✕ 152 mm 11 ✕ 6 in
Wacom Intuos BT S 203 ✕ 152 mm 8 ✕ 6 in
Wacom Intuos4 WL 203 ✕ 127 mm 8 ✕ 5 in
Wacom Intuos4 12x19 483 ✕ 305 mm 19 ✕ 12 in
Wacom Intuos4 8x13 330 ✕ 203 mm 13 ✕ 8 in
Wacom ISDv4 5146 305 ✕ 178 mm 12 ✕ 7 in
Wacom Cintiq Pro 13 305 ✕ 178 mm 12 ✕ 7 in
Wacom MobileStudio Pro 16 356 ✕ 203 mm 14 ✕ 8 in
Wacom MobileStudio Pro 13 305 ✕ 178 mm 12 ✕ 7 in
Wacom ISDv4 484c 254 ✕ 178 mm 10 ✕ 7 in
Wacom DTU-1931 381 ✕ 305 mm 15 ✕ 12 in
Wacom Cintiq 12WX 254 ✕ 178 mm 10 ✕ 7 in
Wacom Cintiq 20WSX 432 ✕ 279 mm 17 ✕ 11 in
Wacom Cintiq 21UX 432 ✕ 330 mm 17 ✕ 13 in
Wacom ISDv4 4004 279 ✕ 152 mm 11 ✕ 6 in
Wacom Cintiq Pro 16 356 ✕ 203 mm 14 ✕ 8 in
Wacom DTF-720 330 ✕ 279 mm 13 ✕ 11 in
Wacom Intuos Pro 2 L 305 ✕ 203 mm 12 ✕ 8 in
Wacom Intuos Pro 2 M 229 ✕ 152 mm 9 ✕ 6 in
Wacom DTH2242 483 ✕ 279 mm 19 ✕ 11 in
Wacom ISDv4 5099 254 ✕ 178 mm 10 ✕ 7 in
Wacom DTK2241 483 ✕ 279 mm 19 ✕ 11 in
Wacom Cintiq Pro 32 686 ✕ 381 mm 27 ✕ 15 in
Wacom Cintiq Pro 24 PT 508 ✕ 305 mm 20 ✕ 12 in
Wacom Intuos3 12x19 483 ✕ 305 mm 19 ✕ 12 in
Wacom Intuos4 6x9 229 ✕ 152 mm 9 ✕ 6 in
Wacom Intuos4 4x6 152 ✕ 102 mm 6 ✕ 4 in
Wacom Intuos Pro 2 L WL 305 ✕ 203 mm 12 ✕ 8 in
Wacom Intuos Pro 2 M WL 229 ✕ 152 mm 9 ✕ 6 in
Wacom Intuos3 4x6 152 ✕ 102 mm 6 ✕ 4 in
Wacom Intuos3 6x8 203 ✕ 152 mm 8 ✕ 6 in
Intuos Pen & Touch Medium 229 ✕ 127 mm 9 ✕ 5 in
Wacom ISDv4 5013 305 ✕ 178 mm 12 ✕ 7 in
Wacom ISDv4 5014 254 ✕ 152 mm 10 ✕ 6 in
Wacom Intuos4 WL 203 ✕ 127 mm 8 ✕ 5 in
Intuos Pen Medium 229 ✕ 127 mm 9 ✕ 5 in
Intuos Pen & Touch Small 152 ✕ 102 mm 6 ✕ 4 in
Intuos Pen Small 152 ✕ 102 mm 6 ✕ 4 in
Wacom ISDv4 50f8 356 ✕ 203 mm 14 ✕ 8 in
Huion H610 Pro 254 ✕ 152 mm 10 ✕ 6 in
Wacom ISDv4 504a 305 ✕ 178 mm 12 ✕ 7 in
Wacom Intuos3 6x11 279 ✕ 152 mm 11 ✕ 6 in
Wacom Intuos3 12x12 305 ✕ 305 mm 12 ✕ 12 in
Wacom Intuos3 9x12 305 ✕ 229 mm 12 ✕ 9 in
XP-Pen Star 03 254 ✕ 152 mm 10 ✕ 6 in
Wacom ISDv4 50f1 305 ✕ 178 mm 12 ✕ 7 in
Wacom Intuos3 4x5 127 ✕ 102 mm 5 ✕ 4 in
Wacom DTK1651 356 ✕ 203 mm 14 ✕ 8 in
Wacom ISDv4 10D 279 ✕ 152 mm 11 ✕ 6 in
Wacom ISDv4 10F 279 ✕ 152 mm 11 ✕ 6 in
Wacom ISDv4 10E 279 ✕ 152 mm 11 ✕ 6 in
Wacom Intuos2 12x18 457 ✕ 305 mm 18 ✕ 12 in
Wacom Intuos2 12x12 305 ✕ 305 mm 12 ✕ 12 in
Wacom Intuos2 9x12 305 ✕ 229 mm 12 ✕ 9 in
Wacom Intuos2 6x8 203 ✕ 152 mm 8 ✕ 6 in
Wacom Intuos2 4x5 127 ✕ 102 mm 5 ✕ 4 in
Wacom DTU1031X 229 ✕ 127 mm 9 ✕ 5 in
Wacom Cintiq 27QHD 610 ✕ 305 mm 24 ✕ 12 in
Wacom ISDv4 503E 305 ✕ 178 mm 12 ✕ 7 in
Wacom ISDv4 117 279 ✕ 152 mm 11 ✕ 6 in
Wacom ISDv4 116 203 ✕ 152 mm 8 ✕ 6 in
Wacom ISDv4 503F 305 ✕ 178 mm 12 ✕ 7 in
Wacom ISDv4 50b8 305 ✕ 178 mm 12 ✕ 7 in
Wacom Cintiq 27QHD touch 610 ✕ 305 mm 24 ✕ 12 in
Wacom ISDv4 50b6 305 ✕ 178 mm 12 ✕ 7 in
Wacom ISDv4 50b4 305 ✕ 178 mm 12 ✕ 7 in
Wacom Intuos5 M 229 ✕ 152 mm 9 ✕ 6 in
Wacom DTU1141 229 ✕ 127 mm 9 ✕ 5 in
Wacom ISDv4 5048 254 ✕ 152 mm 10 ✕ 6 in
Wacom Cintiq 13HD touch 305 ✕ 178 mm 12 ✕ 7 in
Wacom ISDv4 5044 254 ✕ 152 mm 10 ✕ 6 in
Wacom ISDv4 4831 305 ✕ 178 mm 12 ✕ 7 in
Wacom ISDv4 5040 305 ✕ 178 mm 12 ✕ 7 in
Huion H610 Pro 254 ✕ 152 mm 10 ✕ 6 in
Dell Canvas 27 584 ✕ 330 mm 23 ✕ 13 in
Wacom Cintiq Companion 2 305 ✕ 178 mm 12 ✕ 7 in
Wacom Cintiq 22HD 483 ✕ 279 mm 19 ✕ 11 in
Wacom ISDv4 101 279 ✕ 152 mm 11 ✕ 6 in
Wacom ISDv4 100 279 ✕ 152 mm 11 ✕ 6 in
Wacom ISDv4 481a 305 ✕ 178 mm 12 ✕ 7 in
Wacom ISDv4 93 254 ✕ 152 mm 10 ✕ 6 in
Wacom DTU1031 229 ✕ 127 mm 9 ✕ 5 in
Wacom Intuos5 touch L 330 ✕ 203 mm 13 ✕ 8 in
N-Trig Pen 254 ✕ 152 mm 10 ✕ 6 in
Wacom ISDv4 509D 305 ✕ 178 mm 12 ✕ 7 in
Wacom Intuos5 S 152 ✕ 102 mm 6 ✕ 4 in
Wacom Intuos5 touch S 152 ✕ 102 mm 6 ✕ 4 in
Wacom Intuos5 touch M 229 ✕ 152 mm 9 ✕ 6 in
Intuos Pen Medium 229 ✕ 127 mm 9 ✕ 5 in
Wacom ISDv4 4824 102 ✕ 178 mm 4 ✕ 7 in
Wacom Intuos 12x18 457 ✕ 305 mm 18 ✕ 12 in
Wacom ISDv4 4822 279 ✕ 152 mm 11 ✕ 6 in
Wacom Intuos 12x12 305 ✕ 305 mm 12 ✕ 12 in
Wacom Intuos 9x12 305 ✕ 229 mm 12 ✕ 9 in
Wacom Intuos 6x8 203 ✕ 152 mm 8 ✕ 6 in
Wacom Intuos 4x5 127 ✕ 102 mm 5 ✕ 4 in
Wacom ISDv4 90 305 ✕ 203 mm 12 ✕ 8 in
Wacom Cintiq 24HD touch 533 ✕ 330 mm 21 ✕ 13 in
Intuos Pen Small 152 ✕ 102 mm 6 ✕ 4 in
Wacom Bamboo Pad 102 ✕ 76 mm 4 ✕ 3 in
Wacom ISDv4 124 229 ✕ 127 mm 9 ✕ 5 in
Wacom Cintiq Companion 305 ✕ 178 mm 12 ✕ 7 in
Wacom Bamboo Pad Wireless 102 ✕ 76 mm 4 ✕ 3 in
Wacom DTH2452 508 ✕ 305 mm 20 ✕ 12 in
Wacom Cintiq Pro 24 P 508 ✕ 305 mm 20 ✕ 12 in
Wacom ISDv4 93 254 ✕ 152 mm 10 ✕ 6 in
Wacom Graphire 127 ✕ 102 mm 5 ✕ 4 in
Wacom ISDv4 90 305 ✕ 203 mm 12 ✕ 8 in
Wacom Intuos Pro L 330 ✕ 203 mm 13 ✕ 8 in
Wacom Intuos Pro M 229 ✕ 152 mm 9 ✕ 6 in
Wacom Intuos Pro S 152 ✕ 102 mm 6 ✕ 4 in
Wacom Graphire2 4x5 127 ✕ 102 mm 5 ✕ 4 in
Wacom ISDv4 4814 254 ✕ 178 mm 10 ✕ 7 in
Wacom Graphire4 4x5 127 ✕ 102 mm 5 ✕ 4 in
Wacom Graphire3 6x8 203 ✕ 152 mm 8 ✕ 6 in
Wacom Graphire3 4x5 127 ✕ 102 mm 5 ✕ 4 in
Wacom Graphire2 5x7 178 ✕ 127 mm 7 ✕ 5 in
One by Wacom (medium) 229 ✕ 127 mm 9 ✕ 5 in
One by Wacom (small) 152 ✕ 102 mm 6 ✕ 4 in
Wacom Cintiq 24HD 533 ✕ 330 mm 21 ✕ 13 in
Wacom DTU-1631 356 ✕ 203 mm 14 ✕ 8 in
Wacom Bamboo Special Edition Pen & Touch medium 229 ✕ 127 mm 9 ✕ 5 in
Wacom Bamboo Create 152 ✕ 102 mm 6 ✕ 4 in
Wacom ISDv4 114 229 ✕ 127 mm 9 ✕ 5 in
Wacom DTK2451 508 ✕ 305 mm 20 ✕ 12 in
Wacom Bamboo Capture 152 ✕ 102 mm 6 ✕ 4 in
Wacom Bamboo Connect 152 ✕ 102 mm 6 ✕ 4 in
Wacom Bamboo 16FG 4x5 152 ✕ 102 mm 6 ✕ 4 in
Wacom ISDv4 5090 279 ✕ 152 mm 11 ✕ 6 in
Wacom Bamboo Special Edition Pen & Touch small 152 ✕ 102 mm 6 ✕ 4 in
Wacom Cintiq Companion Hybrid 305 ✕ 178 mm 12 ✕ 7 in
Wacom ISDv4 4809 102 ✕ 178 mm 4 ✕ 7 in
XP-Pen Star 03 254 ✕ 152 mm 10 ✕ 6 in
Huion H610 Pro 254 ✕ 152 mm 10 ✕ 6 in
Wacom Cintiq 13HD 305 ✕ 178 mm 12 ✕ 7 in
Intuos Pen & Touch Medium 229 ✕ 127 mm 9 ✕ 5 in
Intuos Pen & Touch Small 152 ✕ 102 mm 6 ✕ 4 in
One by Wacom (medium) 229 ✕ 127 mm 9 ✕ 5 in
One by Wacom (small) 152 ✕ 102 mm 6 ✕ 4 in
Wacom ISDv4 5010 279 ✕ 152 mm 11 ✕ 6 in
Wacom ISDv4 E6 279 ✕ 152 mm 11 ✕ 6 in
Wacom ISDv4 E5 279 ✕ 152 mm 11 ✕ 6 in


As a reference, these are the standard DIN sizes comparable to those models:

DIN type   size
A4 210 x 297 mm
A5 148 x 210 mm
A6 105 x 148 mm


This data comes directly from the Wacom driver and was extracted with this C program Carlos provided:


/* Build with:
 *   gcc `pkg-config --cflags libwacom` -o wacomfoo wacomfoo.c `pkg-config --libs libwacom`
 */

#include <stdio.h>
#include <libwacom/libwacom.h>

#define IN_TO_MM 25.4
 
int
main (int argc, char *argv[])
{
  const WacomDeviceDatabase *db;
  WacomDevice **devices;
  int i;
 
  db = libwacom_database_new ();
  devices = libwacom_list_devices_from_database (db, NULL);
  printf ("| model | active area size mm | active area size in | \n");
  printf ("|:--------- |:--------- |:--------- | \n");

  for (i = 0; devices[i] != NULL; i++)
    {
      if (libwacom_get_width (devices[i]) == 0)
        continue;
      printf ("| %s | %.f ✕ %.f mm | %.f ✕ %.f in | \n",
              libwacom_get_name (devices[i]),
              (double) libwacom_get_width (devices[i]) * IN_TO_MM,
              (double) libwacom_get_height (devices[i]) * IN_TO_MM,
              (double) libwacom_get_width (devices[i]),
              (double) libwacom_get_height (devices[i]));
    }
  return 0;
}

PD: Fixed the correct value for millimeters per inch.

23 de September de 2018

Wacom's graphic tablet sizes

For various reasons I’ve been looking for second-hand Wacom graphic tablets. It has been annoying to find out which size each model has, so I’m writing down here the list of the models I gathered.

The reason for looking only at Wacoms is that these days they seem to be very well supported in Linux, at least the old models you can get second hand.

model active area size
CTL460 147.2 x 92.0 mm
CTL 420 127.6 x 92.8 mm
CTE-430 Graphire 3 127 x 101 mm
CTF-430 127.6 x 92.8 mm
CTL 460 147.2 x 92.0 mm
CTH-460 147.2 x 92.0 mm
CTH-461 147.2 x 92.0 mm
CTH-470 147.2 x 92.0 mm
CTL-470 147.2 x 92.0 mm
CTL-480 Intuos 152 x 95 mm
CTE-640 208.8 x 150.8 mm
CTE-650 216.5 x 135.3 mm
CTH-661 215.9 x 137.16 mm
CTH-670 217 x 137 mm
ET-0405A-U 127 x 106 mm
Graphire 2 127.6 x 92.8 mm
Intuos 2 127.6 x 92.8 mm (probably)
Volito 2 127.6 x 92.8 mm


As a reference, these are the standard DIN sizes comparable to those models:

DIN type   size
A4 210 x 297 mm
A5 148 x 210 mm
A6 105 x 148 mm

If you find any typo or want to add other models feel free to comment.

PD: This post has been obsoleted by a new entry.

03 de August de 2018

On Moving

Winds of Change. One of my favourite songs ever, and one that comes to my mind now that my family and I are going through quite some important changes, once again. But let’s start from the beginning…

A few years ago, back in January 2013, my family and I moved to the UK as a result of my decision to leave Igalia after almost 7 years in the company to embark on the “adventure” of living abroad. This was an idea we had been thinking about for a while at that time, and our situation back then suggested that it could be the right moment to try it out… so we did.

It was kind of a long process though: I first arrived alone in January to make sure I would have time to figure things out and find a permanent place for us to live in, and then my family joined me later in May, once everything was ready. Not great, if you ask me, to be living separated from your loved ones for 4 full months, not to mention the juggling my wife had to do during that time to combine her job with looking after the kids mostly on her own… but we managed to see each other every 2-3 weekends thanks to the London – Coruña direct flights in the meantime, so at least it was bearable from that point of view.

But despite those not so great (yet expected) beginnings, I have to say that these past 5+ years have been an incredible experience overall, and we don’t have a single regret about making the decision to move, maybe just a few minor and punctual things if I’m completely honest, but that’s about it. For instance, it’s been beyond incredible and satisfying to see my kids develop their English skills “from zero to hero”, settle in at their school, make new friends and, in one word, evolve during these past years. And that alone would have been a good reason to justify the move already, but it turns out we also have plenty of other reasons, as we have all evolved and enjoyed the ride quite a lot as well, made many new friends, got to know many new places, worked on different things… a truly enriching experience indeed!

In a way, I confess that this could easily be one of those things we’d probably never have done if we had known in advance all the things we’d have to do and go through along the way, so I’m very grateful for that naive ignorance, since that’s probably how we found the courage, energy and time to do it. And looking backwards, it seems clear to me that it was the right time to do it.

But now it’s 2018 and, even though we had such a great time here both from personal and work-related perspectives, we have decided that it’s time for us to come back to Galicia (Spain), and try to continue our vital journey right from there, in our homeland.

And before you ask… no, this is not because of Brexit. I recognize that the result of the referendum has been a “contributing factor” (we surely didn’t think as much about returning to Spain before that 23rd of June, that’s true), but there were more factors contributing to that decision, which somehow have all aligned together to tell us, very clearly, that Now It’s The Time…

For instance, we always knew that we would eventually move back for my wife to take over the family business, and also that we’d rather make the move in a way that would not be too bad for our kids when it happened. And having a 6yo and a 9yo already, it feels to us like now is the perfect time, since they’re already native English speakers (achievement unlocked!) and we believe that staying any longer would only make it harder for them, especially for my 9yo, because it’s never easy to leave your school, friends and the place you call home behind when you’re a kid (and I know that very well, as I went through that painful experience precisely when I was 9).

Besides that, I’ve also recently decided to leave Endless after 4 years in the company, and so it looks like, once again, moving back home fits nicely with that work-related change, for several reasons. Now, I don’t want to go into much detail on why exactly I decided to leave Endless, so I think I’ll summarize it as me needing a change and a rest after these past years working on Endless OS, which has been an equally awesome and intense experience, as you can imagine. If anything, I just want to be clear that contributing to such a meaningful project, surrounded by such a team of great human beings, was an experience I couldn’t be happier and prouder about, so you can be certain it was not an easy decision to make.

Actually, quite the opposite: a pretty hard one, I’d say… but a nice “side effect” of that decision is that leaving at this precise moment allows me to focus on the relocation in a more organized way, as well as to spend some quality time with my family before leaving the UK. Besides, it will hopefully also give us enough time, once in Spain, to re-organize our lives there, settle properly and even have some extra weeks of true holidays before the kids start school and we start working again in September.

Now, taking a few weeks off and moving back home is very nice and all that, but we still need to have jobs, and this is where our relocation gets extra interesting as it seems that we’re moving home in multiple ways at once…

For one, my wife will start taking over the family business with the help of her dad in her home town of Lalín (Pontevedra), where we plan to be living for the foreseeable future. This is the place where she grew up and where her family and many friends live, but also a place she hasn’t lived in for the last 15 years, so the fact that we’ll be relocating there is already quite a thing in the “moving back home” department for her…

Second, for my kids this will mean having their relatives nearby once again, as well as friends they could only see and play with during holidays until now, which I think is a very good thing for them. Of course, this doesn’t feel as much like moving home for them as it does for us, since they obviously consider the UK their home for now, but our hope is that it will be ok in the medium-long term, even though it will likely be a bit challenging for them at the beginning.

Last, I’ll be moving back to work at Igalia after almost 6 years since I left, which, as you might imagine, feels very much like “moving back home” to me too: I’ll be going back to working in a place I’ve always loved for multiple reasons, surrounded by people I know and already consider friends (I’d even call some of them “best friends”), with its foundations set on important principles and values that still matter very much to me, both from technical (e.g. Open Source, Free Software) and not-so-technical (e.g. flat structure, independence) points of view.

Those who know me better might very well think that I never really moved on, as I hinted in the title of the blog post I wrote years ago, and in some way that’s perhaps not entirely wrong, since it’s no secret I always kept in touch throughout these past years at many levels, and that I always felt enormously proud of my time as an Igalian. Emmanuele even told me that I sometimes enter what he calls “Igalia mode” when I speak of my past time there, as if I were still there… Of course, I haven’t seen any formal evidence of such a thing happening yet, but it certainly sounds like a possibility, as it’s true I easily get carried away when Igalia comes to my mind, maybe as a mix of nostalgia, pride, good memories… that sort of thing. I suppose he’s got a point after all…

So I guess it’s only natural that I finally decided to apply again since, even though both the company and I have evolved quite a bit over these years, the core foundations and principles it’s based upon remain the same, and I still very much align with them. But applying was only one part, so I couldn’t finish this blog post without stating how grateful I am for having been granted this second opportunity to join Igalia once again because, being honest, more often than not I worried about whether I would be “good enough” for the Igalia of 2018. The truth is that I won’t know for real until I actually start working and stay in the company for a while, but knowing that both my former colleagues and the newer Igalians who joined since I left trust me enough is all I need for now, and I couldn’t be more excited or happier about it.

Anyway, this post is already too long and I think I’ve covered everything I wanted to mention On Moving (pun intended with my post from 2012, thanks Will Thompson for the idea!), so I’ll stop right here and re-focus on the last bits of the relocation before we leave the UK for good, now that we have finally left our rented house and put all our stuff in a removals van. After that, I expect a few days of crazy unpacking and bureaucracy to properly settle in Galicia, and then hopefully a few weeks to rest and get our batteries recharged for our new adventure, starting soon in September (yet not too soon!).

As usual, we have no clue what the future will bring, but we have a good feeling about this thing of moving back home in multiple ways, so I believe we’ll be fine as long as we stick together as a family, as we always have so far.

But in any case, please wish us good luck. That’s always welcome! 🙂

1 August 2018

HackIt, SolveIt and SmashCTF (III) – HTML5 DRM – Ideological conflict


DRM and HTML5. EME (Encrypted Media Extensions). You need to read up on these topics to solve this level. EME offers an API that lets web applications interact with content-protection systems in order to play encrypted audio or video: the famous DRM in HTML5, something many consider an aberration (the web was born to be open, not to serve locked-down content). But… the API is there, and it is precisely what we have to deal with. Basically, the client has a video tag. Pressing play shows 26 seconds; from there on, everything is black. The webm video appears to be protected. In the code we see that at some point a license request is made to a license server, which sends us the key to unprotect the webm.

But that request can only be made if we fill in the missing bytes… and those bytes are part of the solution to the sudoku placed below the video. What do we do once we have the video’s decryption key? Watch the video in the browser 🙂 And afterwards? Well, we’ll see shortly… Let’s take it step by step. The first thing is to solve the sudoku. The next is to automate typing the numbers into the sudoku cells (doing it by hand is hell).
Solving the sudoku is easy. We go to sudoku-solutions.com, enter the data and click check…
Oops, it has 9 possible solutions. It couldn’t be that easy…
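For reference, the same check can be done offline with a small backtracking solver instead of the website. This is just a sketch: the solved grid below is the one quoted later in the post, and since the post doesn’t record which cells were originally blank, the blanked cells here are illustrative.

```python
# Minimal backtracking sudoku solver. SOLVED is the challenge's final grid
# (quoted later in the post); the blanked cells are illustrative only.
SOLVED = ("852931647473862159961547283318476925"
          "549328761726159834637294518194685372285713496")

def candidates(grid, i):
    """Digits still allowed at position i of an 81-char grid string."""
    row, col = divmod(i, 9)
    box_r, box_c = (row // 3) * 3, (col // 3) * 3
    used = set(grid[row * 9:(row + 1) * 9])             # same row
    used |= {grid[col + 9 * r] for r in range(9)}       # same column
    used |= {grid[(box_r + r) * 9 + box_c + c]          # same 3x3 box
             for r in range(3) for c in range(3)}
    return [d for d in "123456789" if d not in used]

def solve(grid, solutions, limit=20):
    """Collect up to `limit` full solutions of a partially filled grid."""
    i = grid.find("0")
    if i == -1:
        solutions.append(grid)
        return
    for d in candidates(grid, i):
        if len(solutions) < limit:
            solve(grid[:i] + d + grid[i + 1:], solutions)

# Blank a handful of cells and check the solver recovers the original grid.
puzzle = "000" + SOLVED[3:]
found = []
solve(puzzle, found)
print(len(found), SOLVED in found)  # -> 1 True
```

With the real puzzle (and its 9 solutions) the same `solve()` call would simply return all of them.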

To avoid wasting time typing each one of them, we can automate the process. Open the JavaScript console and type:

var sudoku = $("#su input")
var s ="852931647473862159961547283318476925549328761726159834637294518194685372285713496"
for (var i = 0; i < sudoku.length; i++){ sudoku[i].value = s.charAt(i); }
$("video")[0].needs_reload = 1;

By the way, that code already contains the correct solution 🙂 The last line tells the page that the sudoku has changed and its data must be re-read. OK, everything ready. We press play and we get past second 26. It’s a trailer for “Inception”. There is a series of frames showing distorted pixels, most likely because some string that shouldn’t be there was inserted. We’ll have to download the webm, decrypt it and open it roughly at that point to see what string it is.

But how do we obtain the webm decryption key? (The browser knows it, but we need to isolate it…) And by the way, how many keys will there be? Let’s go.

We open main.js and set a breakpoint at line 71:

}).then(function(e) {
                var n = (e = new Uint8Array(e)).slice(0, 12);
                return window.crypto.subtle.decrypt({
                    name: "AES-GCM",
                    iv: n,
                    tagLength: 128
                }, r, e.slice(12))
            }).then(function(e) {
breakpoint --->                return t.target.update(e)
            })

In e we’ll have the key. Careful: the breakpoint fires twice, and we’ll need to write down both keys (one encrypts the video and the other the audio). If I remember correctly it wasn’t “that simple”: the keys obtained in e had to be converted with a line like

atob(String.fromCharCode.apply(null, new Uint8Array(e)))

and then extract the 16-byte keys with a script like the following (one of the keys was w-UHS…):

var b64string = "w-UHS56ogAQacZLNj1TpqA" ;
var buf = Buffer.from(b64string, 'base64');
var fs = require('fs');
fs.writeFile("key.txt", buf,  "binary",function(err) {
    if(err) {
        console.log(err);
    } else {
        console.log("The file was saved!");
    }
});
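The same key dump can be done in Python; note the key string quoted above uses the URL-safe base64 alphabet (“w-UHS…” contains a ‘-’) and comes without padding, so both have to be handled before writing the raw 16 bytes to a file:

```python
import base64

# Content key as obtained from the breakpoint, base64url-encoded, no padding.
b64key = "w-UHS56ogAQacZLNj1TpqA"

# Re-add padding ('=' up to a multiple of 4) and decode with the URL-safe alphabet.
raw = base64.urlsafe_b64decode(b64key + "=" * (-len(b64key) % 4))
assert len(raw) == 16  # a 128-bit AES content key

with open("key.txt", "wb") as f:
    f.write(raw)
```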

Time to download the (encrypted) video and decrypt it. How do we decrypt? Well, we know the key and we have the encrypted video; what’s missing is how it was encrypted. A bit of research turns up the webm_crypt utility from webm-tools. It depends on libwebm, but following the build instructions in the previous link you can get it without trouble (on Linux; on macOS it didn’t work).

We decrypt with:

$ webm_crypt -i input.webm -o decrypted.webm -decrypt -audio_options base_file=clave1 -video_options base_file=clave2

And finally we can open decrypted.webm (for example, with vlc…)

or with strings (!)

$ strings -n 10 decrypted.webm

(Note: -n 10 = print the ASCII strings found in decrypted.webm, as long as those strings are at least 10 characters long)
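What `strings -n 10` does can be sketched in a few lines of Python (a simplified version: printable-ASCII runs only, none of the encoding options of the real tool):

```python
import re

def ascii_strings(data: bytes, minlen: int = 10):
    """Return printable-ASCII runs (0x20-0x7e plus tab) of at least `minlen` chars."""
    pattern = rb"[\x20-\x7e\t]{%d,}" % minlen
    return [m.group().decode("ascii") for m in re.finditer(pattern, data)]

# Toy blob: only the long run survives the length filter.
blob = b"\x00\x01short\x02the key for the next level\xff\x03tiny\x00"
print(ascii_strings(blob))  # -> ['the key for the next level']
```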

And inspecting the output of strings, we’ll see the key for the next level 🙂

PS: I believe there is a tool that takes a webm video and a timestamp (hh:mm:ss) as input and outputs the content of the frame at that timestamp, which would avoid (or ease) the use of strings. But I’ll leave that for the W0pr, navarparty or Barcelona92 folks to tell us in the comments.

30 July 2018

HackIt, SolveIt and SmashCTF (II). カッター注意

We’re given a .pcap capture, which we open with Wireshark. In the first UDP packets we see that device 10.10.0.78 is talking to 10.10.0.70.

According to its MAC address, .78 was manufactured by CASIO (which matches the level’s hint). It is a tagger (according to the strings in the first packets), and the model seems to be a MEP r2 (?). Googling “Casio Tagger MEP” turns up a small printer/label maker. And if we search marcan’s Twitter with the keywords “casio marcan42 twitter”… Bingo!

What’s more, the tweet in question points to a Gist with a Python program that implements the printer’s protocol, able to convert images from files, send them to the printer and print them.

Reading it with the pcap open, we see that an initial protocol handshake takes place first (specifying, among other things, the image’s height and width):

>>> import struct; struct.unpack("<HHxx", b"\x70\x01\x00\x17\x00\x00")
(368, 5888)

That’s 368 rows of 16 bytes each (368 × 16 = 5888). Hence 16 × 8 = 128 bits per row.

The image data itself starts at packet 43 (all the packets with a 512-byte payload). Now we have everything we need to extract the data that makes up the image.
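The extraction script itself isn’t reproduced in the post; a minimal sketch of what it has to do, assuming a classic (non-pcapng, little-endian) capture and Ethernet + IPv4-without-options + UDP framing, i.e. a fixed 42-byte header in front of each payload:

```python
import struct

def udp_payloads(pcap_bytes, size=512):
    """Concatenate the UDP payloads of exactly `size` bytes, in capture order."""
    # Classic pcap magic, little-endian capture assumed.
    assert struct.unpack("<I", pcap_bytes[:4])[0] == 0xA1B2C3D4
    out, off = [], 24                      # skip the 24-byte global header
    while off + 16 <= len(pcap_bytes):
        incl_len = struct.unpack("<I", pcap_bytes[off + 8:off + 12])[0]
        frame = pcap_bytes[off + 16:off + 16 + incl_len]
        payload = frame[42:]               # Ethernet(14) + IPv4(20) + UDP(8)
        if len(payload) == size:
            out.append(payload)
        off += 16 + incl_len
    return b"".join(out)

# Usage against the real capture would be something like:
# with open("captura.pcap", "rb") as f:
#     open("payload.bin", "wb").write(udp_payloads(f.read()))
```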

If we run the script:

python leer.py > payload.bin

we can solve directly with:

xxd -b -c 16 payload.bin

But since we have the code that encodes an image into 1s and 0s, we can easily invert the process:
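The inverted script isn’t embedded here, but the idea amounts to rendering the payload as a 1-bit bitmap. A sketch, assuming 368 rows of 16 bytes (the dimensions unpacked above) and most-significant-bit-first pixel order:

```python
def render(payload, width_bytes=16):
    """Turn a 1-bit-per-pixel payload into printable rows ('#' = ink)."""
    rows = []
    for off in range(0, len(payload), width_bytes):
        row = payload[off:off + width_bytes]
        bits = "".join(f"{byte:08b}" for byte in row)  # MSB-first assumed
        rows.append(bits.replace("0", " ").replace("1", "#"))
    return rows

# Against the real dump: rows = render(open("payload.bin", "rb").read())
demo = bytes([0b10000001, 0b11111111])
print(render(demo, width_bytes=2))  # -> ['#      #########']
```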

And solve 🙂

PS: カッター注意 = “Caution: cutter”? (something like “mind the blade”, the one that cuts the printer’s paper strips?)

29 July 2018

HackIt, SolveIt and SmashCTF: learning is a gift (I)

As every year since Euskal Encounter 7 (formerly Euskal Party), we showed up for the event. The point is not just trying to win (that too), but learning. And even though we know ever more techniques, tools and theory in our field, often, after fighting with the HackIt and SolveIt challenges, I get the feeling that the gaps in my knowledge keep growing too. What do I know about Manchester encoding? What on earth is Toslink S/PDIF? Not to mention knowledge forgotten looong ago (what is the chemical structure of aspirin?). But…

“Success consists of going from failure to failure without loss of enthusiasm” (Winston Churchill)

So, every Euskal, the die-hards of DiarioLinux (maybe next year we’ll be Ikasten.io, now that renaming yourself is fashionable, right Failrz? :) give it another try.

In the 2018 HackIt we finished in third place, but we managed to solve three challenges (the third one, after much pain, at around 5am on Saturday). In the SolveIt we don’t usually obsess as much, and this year we could only solve the first challenge. A painful FAIL, yes.

“Learning is a gift, even when pain is the teacher” (Anonymous)

We like to discuss the challenges once the deadline is over: marcan’s crazy ideas (trolling) and how we managed to solve them. This year there was a debriefing covering the three tracks (HackIt, SolveIt, SmashCTF) at 00:00 on Sunday (right after the deadline), and that is when, with the ideas you have been trying for some 60 hours still fresh, and a terrible curiosity (“jakin-mina”) to know the answer to certain challenges that kept you from sleeping, you really pay attention to the explanations. When you are thirsty to learn. And when you applaud the efforts of other teams to overcome the pain (sleep deprivation, anger and headaches 🙂) of pushing one challenge further, of reading a specification in more detail, of understanding a binary for an archaic platform down to the last detail… I think that effort, that drive to improve, is what infects me and makes me drop everything, year after year, to be there.

And without further ado (or quotes! 🙂), let’s get to the meat. HackIt challenge 1.

Simple JavaScript code that operates on the values entered in the input, performing 10 comparisons (2 per iteration, 5 iterations) to confirm they meet the criteria that validate them as the password. The variable c counts how many conditions hold. The only thing that is puzzling at first is that it adds two booleans (boolean condition + boolean condition); the idea is that false evaluates to 0 and true to 1.
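As a side note, the false→0 / true→1 trick works the same way in Python, where bool is a subclass of int (just as the challenge’s JavaScript `+` coerces booleans to numbers). A toy check with two made-up conditions, not the challenge’s:

```python
# Each comparison contributes 0 or 1 to the counter, mirroring the
# challenge's `c += (cond) + (cond)` pattern. Conditions here are invented.
def conditions_met(x, y):
    c = 0
    c += (x + y == 10) + (x * y == 21)  # True + True == 2
    return c

print(conditions_met(3, 7))  # -> 2 (both conditions hold)
print(conditions_met(4, 6))  # -> 1 (only the sum condition holds)
```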

Although we initially thought of solving the level the orthodox way (solving the system of equations), in the end we decided brute force was much faster (click on the image to enlarge) 🙂


And with a console.log(solucion), on to the next level…

26 July 2018

Running EPF (Eclipse Process Framework) Composer in Linux, v2

From time to time I try to do practical work with EPF Composer, and my first hurdle is always getting it to run natively on Linux: for one reason or another I always got stuck with the provided Linux build.

This time I chose a different approach: using the Windows build, which is known to work under Wine as a compatibility layer. It worked. These are the installation instructions:

First we’ll create a separate Wine folder and, importantly, set it to the Win32 architecture used by the EPF Composer build:

export WINEPREFIX=~/.wine-epf-1.5.2
export WINEARCH=win32

Now we install Internet Explorer 8 using the handy winetricks tool, since EPF Composer uses it for the HTML forms:

winetricks winxp ie8 corefonts

Install a Java runtime. I’ve only tried the Oracle one:

wine jre-8u181-windows-i586.exe

Unzip the EPF Composer build inside the Wine filesystem:

cd $WINEPREFIX/drive_c/Program\ Files/
unzip ~/epf-composer-1.5.2-win32.zip
chmod a+x $WINEPREFIX/drive_c/Program\ Files/epf-composer/epf.exe

And now you should be able to run EPF Composer:

$WINEPREFIX/drive_c/Program\ Files/epf-composer/epf.exe 

From another terminal, you can launch it by setting the Wine environment variables inline instead of exporting them:

WINEPREFIX=~/.wine-epf-1.5.2 WINEARCH=win32 ~/.wine-epf-1.5.2//drive_c/Program\ Files/epf-composer/epf.exe 

And that’s it.

I haven’t started doing real work with it yet, but it seems fully operational.

As I’m using a 14” HD screen, my only real concern is the font sizes. For some reason, changing the resolution and font sizes with winecfg gives some weird results.

Hope this helps.

PS: Added winxp and corefonts to the winetricks invocation.

25 June 2018

Registration open for GUADEC 2018 in Almería

GUADEC 2018 poster

We are immensely pleased to announce that registration is now open for GUADEC 2018, which will take place on 6–11 July in our city of Almería. The meeting includes three days of conferences, three days of workshops and technical meetings, and several social activities for members of the GNOME development community.

GUADEC 2018 emblem

About GUADEC

The GNOME Users and Developers European Conference (GUADEC) is an annual gathering of GNOME developers and enthusiasts, as well as individual, business, educational and government users from around the world. It provides a forum for members of the GNOME project to showcase their work and discuss the future development of GNOME. GUADEC is also home to free software luminaries and high-level government and business IT leaders who discuss strategies, deployment options and the future of free software. Each year GUADEC is hosted in a different European country.

Who can come to GUADEC?

GUADEC is the largest annual international meeting of the GNOME community, but it is also a meeting point for enthusiasts and technologists passionate about software development in open communities and about software for end users, and for those interested in building and maintaining information-society infrastructure by expanding the digital commons.

GUADEC 2017, Manchester

Don’t miss it.

More information at:

GNOME logo

About GNOME

GNOME is a desktop environment and development platform for GNU/Linux, Unix and Unix-like systems such as BSD or Solaris, composed entirely of free software. With a user base of millions worldwide, it has been translated into 166 languages and is available in the main GNU/Linux distributions, including Fedora, Debian, Ubuntu, Red Hat Linux, CentOS, Oracle Linux, Arch Linux and Gentoo.

17 May 2018

Performance hackfest

Last evening I came back from the GNOME performance hackfest in Cambridge. There was plenty of activity, clear skies, and pub evenings. Here are some incomplete and unordered items, just the ones I could do/remember/witness/discuss/overhear:

  • Xwayland 1.20 seems to be a big battery saver. Christian Kellner noticed that X11 Firefox playing YouTube could take his laptop to >20W consumption, traced to fairly intensive GPU activity. One of the first things we did was try master, which dropped power draw to 8-9W. We presumed this was due to the implementation of the Present extension.
  • I was looking into dropping the gnome-shell usage of AtspiEventListener for the OSK. It is really taxing on CPU usage (even if the events we want are a minuscule subset, gnome-shell will forever get all that D-Bus traffic, and a11y is massively verbose), plus it slowly but steadily leaks memory.

    For the other remaining path I started looking into at least being able to deinitialize it. The leak deserves investigation, but I thought my time could be better invested on other things than learning yet another codebase.

  • Jonas Ådahl and Christian Hergert worked towards having Mutter dump detailed per-frame information, and Sysprof able to visualize it. This is quite exciting, as all the classic options just let us know where we spend time overall, but don’t tell us whether we missed the frame mark, nor why precisely that would be. Update: I’ve been told that Eric Anholt also worked on GPU perf events in mesa/vc4, so this info could also be visualized through Sysprof.
  • Peter Robinson and Marco Trevisan ran into some unexpected trouble when booting GNOME on an ARM board with no input devices whatsoever. I helped a bit with debugging and ideas, and Marco did some patches to neatly handle this situation.
  • Hans de Goede did some nice progress towards having the GDM session consume as little as possible while switched away from it.
  • Some patch review went on; Jonas, Marco and me spent some time looking very closely at a screen, discussing the mipmapping optimizations from Daniel Van Vugt.
  • I worked towards fixing the reported artifacts from my patches to aggressively cache paint volumes. These are basically one-off cases where individual ClutterActors break the invariants that would make caching possible.
  • Christian Kellner picked up my idea of performing pointer picking purely on the CPU side when the stage purely consists of 2D actors, instead of using the usual GL approach of “repaint in distinctive colors, read pixel to perform hit detection” which is certainly necessary for 3D, but quite a big roundtrip for 2D.
  • Alberto Ruiz and Richard Hughes talked about how to improve gnome-software memory usage in the background.
  • Alberto and me briefly dabbled with the idea of a specific search provider API more closely tied to Tracker, in order to ease the many context switches triggered by overview search.
  • On the train ride back, I unstashed and continued work on a WIP tracker-miners patch to have tracker-extract able to shutdown on inactivity. One less daemon to have usually running.

Overall, it was a nice and productive event. IMO, having people with good knowledge both deep in the stack and wide across GNOME was decisive; I hope we can repeat this feat again soon!