Air Mozilla: German speaking community bi-weekly meeting |
Bi-weekly meeting of the German-speaking community.
https://air.mozilla.org/german-speaking-community-bi-weekly-meeting-20150326/
|
Emma Irwin: Opensource.com Article on Mozilla Community Education |
Super excited to share my post published on opensource.com for Open Education Week: “Mozilla cares for community with educational resources”.
http://tiptoes.ca/opensource-com-article-on-mozilla-community-education/
|
Mozilla Reps Community: New Rep Mentors, welcome! |
Dear Reps Planet,
The council is excited to share with you our second group of new Mozilla Rep Mentors this year.
These are Reps the council has recognized as being equally good at inspiring and empowering others as they are at leading globally and locally in their communities.
As mentorship is core to the program, we are very grateful they have agreed to take on this new responsibility.
A crucial role in the Mozilla Reps ecosystem is that of a mentor. We strive for every Rep to become a mentor for the program to become self-sustaining and for Reps to play a central role in our ambitious goals for growing and enabling the Mozilla Community. We’ve just accepted eight new mentors, bringing the current total to 54.
Our new mentors are:
Please join us in congratulating our new Mozilla Rep Mentors – via this thread on Discourse
Reps Mentor Role Description:
Welcome New Reps Mentors!
https://blog.mozilla.org/mozillareps/2015/03/26/new-rep-mentors-welcome/
|
Air Mozilla: Community Education Call |
The Community Education Working Group exists to merge ideas, opportunities, efforts and impact across the entire project through Education & Training.
|
Monty Montgomery: Daala Blog-Like Update: Bug or feature? [or, the law of Unintentionally Intentional Behaviors] |
Codec development is often an exercise in tracking down examples of "that's funny... why is it doing that?" The usual hope is that unexpected behaviors spring from a simple bug, and finding bugs is like finding free performance. Fix the bug, and things usually work better.
Often, though, hunting down the 'bug' is a frustrating exercise in finding that the code is not misbehaving at all; it's functioning exactly as designed. Then the question becomes a thornier issue of determining if the design is broken, and if so, how to fix it. If it's fixable. And the fix is worth it.
|
Matt Thompson: Mozilla Foundation March 2015 Board Meeting |
Mozilla-wide goals: grow long-term relationships that
|
Gervase Markham: Top 50 DOS Problems Solved: Shrinking Hard Disk |
Q: My hard disk seems to be getting smaller! There is a megabyte less free space than there was a month ago, yet I have not saved anywhere near 1MB’s worth of files. What’s going on?
A: This is quite a common problem, but most sufferers don’t realise they’ve got it. What happens is that some of the free space gets allocated to a non-existent file.
In other words the disk filing system has, in your case, a megabyte allocated to one or more files that don’t have a directory entry. They cannot therefore be seen with the DIR command, nor deleted.
Fortunately it is possible to turn these lost chains, as they are called, back into real files which can then be seen and deleted in the normal way. Simply type this command:
CHKDSK /F
If you have any lost chains, Chkdsk will tell you so and ask you if you want to convert them into files. Answer ‘Y’.
The recovered files will appear in the root directory with names like FILE0000.CHK, FILE0001.CHK, FILE0002.CHK…
http://feedproxy.google.com/~r/HackingForChrist/~3/ZeJ8Qps8pLI/
|
Mozilla Science Lab: Ask Us Anything: Lessons from the Local Research Coding Groups Panel Discussion |
On Tuesday, the Science Lab conducted its first Ask Us Anything on the MSL Forum, organized by Noam Ross. The topic was lessons learned in running local study groups, users’ groups, hacky hours and other meetups for researchers using and writing code; many thanks go out to the seven panelists who were available to answer questions:
This was a tremendously successful event, with a sustained conversation of more than a post per minute on the topic for two full hours; a lot of interesting ideas came out of the discussion, a few of which I summarize here, followed by detailed discussion below; also, be sure to check out the full thread.
A few great ideas for study groups can be distilled from this event:
And for online AMA-style events:
One of the first things the thread discussed was one of the most common problems for any regular community event: sustainability. How do we get people coming out and participating, month after month, and maintain momentum?
The panel quickly zeroed in on an interesting challenge: will interest be inspired and sustained by highly targeted skills and tools trainings, or will keeping things as general as possible appeal to a wider audience? Highly specific material will be most attractive to the small group of people already interested in it, while general topics might leave a potential attendee unclear on how relevant they’ll be, even if they apply, in principle, to a wider range of people.
This led to an important observation: the bigger the pool of people a study group is communicating with, the better its attendance will be. Panelists seemed to have a bit more success with the specific and clearly practically applicable; what allowed these groups to keep attendance up despite getting into the nitty-gritty was developing a large audience of people aware of their activities. Numbers seemed to hover around 10% attendance, comparing the number of actual attendees to the size of mailing lists; but with a large audience (critical mass seemed to be around 200 people), there’s sure to be a cohort of people interested in whichever specific topic the group wants to take up.
But what about the early days, before a new group has gotten in front of that first 200? Fiona and Jeff made a key observation: stick to it, even if the first couple of events are just you and one or two other people. It takes time for word of mouth to spread, time for people to make up their minds that they’re comfortable dipping their toe into something like a meetup group – and, worst case, you’ve set aside some time to get some of your own work done and have a beer.
Finally on the topic of sustainability, another common concern that came up was the relationship of organizers to the host institution; post-docs and students move on after only a few short years, and without someone to pick up the torch, efforts can fizzle out. The panel agreed that it’s crucial for senior organizers to cultivate relationships with people who they can hand off to in future, but this calls out another key design question: how can we design a really smooth hand-off procedure between generations of organizers? This is a long term goal a bit beyond the concerns of groups just getting started, but I think with some savvy design, this process can be made quite smooth; more of my own ideas on this will be forthcoming on this blog very soon.
We need that pool of 200 people thinking about our event – how do we assemble them to begin with?
Organizers found, perhaps surprisingly, that their attendees were pretty quiet on Twitter, and didn’t generate much conversation there, although Twitter might be more effective as a ‘push’ platform, to let people know about events and content. More successful were blogs and mailing lists; panelists cited the familiarity of these formats to most researchers.
A novel approach that a few of the groups based at universities took was to approach departments for inclusion in departmental newsletters and welcome packages for new students. Not only do these communication channels typically already exist in most institutions, they can put a group in front of a large number of potentially interested people quickly, and lend a degree of inclusion in the establishment that helps catch people’s attention.
One thing I love about getting a bunch of people together to talk is that novel ideas always come out. One of my favorites was a whole other flavor of event that a study group could put on: Fiona Tweedie described ‘Research Speed Dating’, an event where a bunch of people set up short demos of tools they use in their research, and attendees circulate among them, exploring the tools in short, five-minute introductions to see if they might be interested in looking deeper into them at a future meetup. Topics that garner a lot of interest are chosen for deeper dives at future events, and prospective participants get to meet organizers and start developing connections in a relatively no-pressure atmosphere.
Another observation I found compelling came from Rayna Harris: graduate school often involves working on the same project for years, and the singular focus can be maddening after a while. It’s really refreshing to have a project that comes in little, month-long bites; from announcing a meetup to delivering it can easily take only a few weeks, giving a sense of delivery and completion on a much faster cadence than research naturally provides.
A number of people also asked me about the AMA format itself; I think it was a big success, and it was largely thanks to some design decisions Noam Ross made when we were setting this event up:
A remarkable thing about this event was that the same sort of skill and knowledge sharing that happens so naturally at a conference, and that I’ve been trying to produce online, emerged here; by sitting a half dozen people down around a topic in a finite time window (we did two hours and it didn’t drag at all), the same sort of connections and mutual understanding came out.
A number of interesting ideas, metrics and goals for study groups came out of this conversation, which we’ll be folding in to our forthcoming support for setting up your own meetup – watch this space for news and opportunities in that project coming very soon, and in the meantime, make sure your local study group is on the map!
Given what a great time and what a productive discussion everyone had on the forum on Tuesday, I’m looking forward to making these panel AMAs a regular event at the Lab; if you have a topic you’d like to suggest, post it in the Events section of the forum, or tweet it to us at @MozillaScience and @billdoesphysics. I hope you’ll join us!
|
Mozilla Release Management Team: Firefox 37 beta7 to rc |
Due to the short cycle (5 weeks instead of 6), we landed more changes than usual in the RC build.
We took some stability fixes for graphic issues.
Extension | Occurrences |
h | 17 |
cpp | 17 |
ini | 7 |
py | 3 |
list | 2 |
js | 2 |
html | 2 |
sh | 1 |
json | 1 |
ipdlh | 1 |
hgtags | 1 |
build | 1 |
Module | Occurrences |
storage | 17 |
dom | 14 |
mobile | 12 |
gfx | 8 |
widget | 4 |
testing | 3 |
layout | 3 |
docshell | 2 |
security | 1 |
editor | 1 |
browser | 1 |
List of changesets:
Chris Manchester | Bug 1145444. r=jmaher, a=test-only - 1efc8c39543c |
Jeff Gilbert | Bug 1143218 - Use mochitest subsuites to specify webgl tests. r=jmaher, r=gbrown, a=test-only - a58b8b594396 |
Kyle Huey | Bug 1145870. r=bz, a=lmandel - 0725e4cfa3c3 |
Cykesiopka | Bug 1121117 - Add fuzz time to workaround non-monotonicity of Date(). r=keeler, a=test-only - 8358c6c2c417 |
Tim Taubert | Bug 1088163 - Fix intermittent browser_offlineQuotaNotification.js timeouts by properly waiting for a notification to show. r=markh, a=test-only - 72912a71fb98 |
Ehsan Akhgari | Bug 1142360 - Move the mochitests for bugs 441782, 467672 and 570378 to the reftest framework. r=dbaron, a=test-only - 62a72d33d16b |
Neil Deakin | Bug 942411 - Change the frame height to force a reflow and renable the test on Linux to see if it helps. r=smaug, a=test-only - b8ec30b0a437 |
James Willcox | Bug 1090300 - Repopulate input buffers when necessary in Android media decoder. r=gcp, a=lmandel - 2cca5b090036 |
Ryan VanderMeulen | Bug 1146061 - Skip test_peerConnection_basicH264Video.html on Windows debug. a=test-only - 19b630388dda |
Ryan VanderMeulen | Backed out changeset 72912a71fb98 (Bug 1088163) because it depends on BrowserTestUtils, which isn't available on 37. - 196c6575593d |
Matt Woodrow | Backed out changeset 0c23dcbc6bf7 (Bug 1138967) for causing crashes - 6d7a2555b021 |
Matt Woodrow | Backed out changeset 0c23dcbc6bf7 (Bug 1138967) for causing crashes. CLOSED TREE - 2592523e1eb0 |
Olli Pettay | Bug 1146339 - Do anchor scrolling right before dispatching popstate/hashchange. r=bz, a=lmandel - 4d306a83ae1b |
Marco Bonardo | Bug 1005991 - mozStorage should not use XPCVariant off the main thread. r=asuth, a=lmandel - b8c1a399905d |
Marco Bonardo | Bug 1005991 - Trivial fixes for non-unified builds. r=me, a=lmandel - fadc9f270e9f |
Ryan VanderMeulen | Merge beta to m-r. a=merge - 07c827be741f |
Steven Michaud | Bug 1137229 - Keyboard input can stop working in a window. r=smaug a=lmandel CLOSED TREE - 45961b7d67dc |
Shih-Chiang Chien | Bug 1080130 - Unreferenced socket might be closed before opened. r=khuey, a=test-only - b6a4dca0edc9 |
Jeff Muizelaar | Bug 1137716 - Try blacklisting Optimus w/ Intel Ironlake Graphics. r=bas, a=lmandel - d56b6d648c01 |
Matt Woodrow | Bug 1145585 - Hold a ref to the right texture. r=jmuizelaar, a=lmandel - e35deaa85d21 |
Ehsan Akhgari | Bug 1146883 - Null check the node passed to GetGoodSelPointForNode. r=smaug, a=lmandel - 8fda35675a3f |
Ryan VanderMeulen | Merge beta to m-r. a=merge - 7ec23d08cf32 |
http://release.mozilla.org/statistics/37/2015/03/26/fx-37-b7-to-rc.html
|
Karl Dubost: Refresh HTTP Header |
Through discussions on whatwg, I learned (or I had just forgotten) about the Refresh HTTP header. Let's cut straight to the syntax:
HTTP/1.1 200 OK
Refresh: 5; url=http://www.example.org/fresh-as-a-summer-breeze
where 5 means here 5 seconds and url= gives the destination where the client should head after 5 seconds. Simon Pieters (Opera) is saying in that mail:
I think Refresh as an HTTP header is not specified anywhere, so per spec
it shouldn't work. However I think browsers all support it, so it would be
good to specify it.
Eric Law (ex-Microsoft) has written about The Performance Impact of META REFRESH. If we express the previous HTTP header in HTML, we get:
<meta http-equiv="refresh" content="5;url=http://www.example.org/fresh-as-a-summer-breeze" />
In his blog post, Eric is talking about people using refresh
to… well refresh the page. He means loading the same exact page over and over again. And indeed it means for the browser to create a certain number of "unconditional and conditional HTTP requests to revalidate the page’s resources" for each reload (refresh).
On the Web Compatibility side of things, I see the meta refresh used quite often.
<meta http-equiv="refresh" content="0;url=http://example.com/there" />
Note the 0. It is probably the result of sysadmins not willing to touch the configuration of the servers, and so front-end developers taking the lead to "fix it" instead of using an HTTP 302 or HTTP 301. Anyway, it is something which is used most of the time for redirecting to another domain name or URI. The Refresh HTTP header, on the other hand, I don't remember seeing that often.
Simon is saying: "it would be good to specify it." I'm not so sure. First things first.
Let's create a test by making a page that sends a Refresh header:
Header set Refresh "0;url=https://www.youtube.com/watch?v=sTJ1XwGDcA4"
which gives
HTTP/1.1 200 OK
Accept-Ranges: bytes
Connection: Keep-Alive
Content-Length: 200
Content-Type: text/html; charset=utf-8
Date: Thu, 26 Mar 2015 05:48:57 GMT
ETag: "c8-5122a67ec0240"
Expires: Thu, 02 Apr 2015 05:48:57 GMT
Keep-Alive: timeout=5, max=100
Last-Modified: Thu, 26 Mar 2015 05:37:05 GMT
Refresh: 0;url=https://www.youtube.com/watch?v=sTJ1XwGDcA4
This should redirect to this Fresh page
It would help if someone could test with at least IE and Chrome.
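To reproduce the test without touching an Apache configuration, a small local server can send the same header. The following is a minimal sketch of my own (not from the original post), using Python's standard library; the port and target URL are arbitrary placeholders:

# Minimal sketch: serve a page that carries a Refresh header so a browser's
# handling of it can be observed. The target URL and port are placeholders.
from http.server import BaseHTTPRequestHandler, HTTPServer

TARGET = "https://www.youtube.com/watch?v=sTJ1XwGDcA4"

class RefreshHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<p>You should be redirected shortly...</p>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.send_header("Refresh", "0;url=" + TARGET)  # 0 seconds, then load TARGET
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8000), RefreshHandler).serve_forever()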
On the Mozilla bug tracker, there are a certain number of bugs around refresh. This bug about inline resources is quite interesting and might indeed need to be addressed if there were documentation. The bug is about what the browser should do when the Refresh HTTP header is set on an image included in a Web page (this could be another test). For now, the refresh is not done for inline resources. Then what about scripts, stylesheets, JSON files, HTML documents in iframes, etc.? Around the SetupRefreshURIFromHeader code, there are Web Compatibility hacks in the source code of Firefox. We can read:
// Also note that the seconds and URL separator can be either
// a ';' or a ','. The ',' separator should be illegal but CNN
// is using it.
also:
// Note that URI should start with "url=" but we allow omission
and… spaces!
// We've had at least one whitespace so tolerate the mistake
// and drop through.
// e.g. content="10 foo"
Good times…
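To make those tolerances concrete, here is a rough sketch of a lenient parser for a Refresh value (my own illustration of the quirks quoted above, not the actual Firefox code): it accepts ';' or ',' as the separator, an omitted url= prefix, and plain whitespace between the seconds and the URL.

import re

def parse_refresh(value):
    """Leniently parse a Refresh value into (seconds, url).

    Illustrative only: accepts ';' or ',' as separator, an omitted
    'url=' prefix, and plain whitespace between seconds and URL.
    """
    match = re.match(r"\s*(\d+)\s*(?:[;,]|\s)?\s*(?:url\s*=\s*)?(.*)", value, re.IGNORECASE)
    if not match:
        return None
    seconds = int(match.group(1))
    url = match.group(2).strip().strip("\"'") or None
    return seconds, url

print(parse_refresh("5; url=http://example.org/"))  # (5, 'http://example.org/')
print(parse_refresh("5, http://example.org/"))      # ',' tolerated as separator
print(parse_refresh("10 foo"))                      # whitespace only, no url=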
On the WebKit bug tracker, I found another couple of bugs, but about meta refresh and not specifically the Refresh: header. But I'm not sure whether it's handled by WebCore or elsewhere in Mac OS X (NSURLRequest, NSURLConnection, …). If someone knows, tell me. I haven't explored the source code yet.
On the Chromium bug tracker, there are another couple of bugs for meta refresh, with some interesting ones such as this person complaining that a space doesn't work instead of a ;. This is also tracked on WebKit. Something like:
<meta http-equiv="refresh" content="0 url=http://example.com/there" />
Also, what should be done with a relative URL?
<meta http-equiv="refresh" content="0;url=/there" />
But for Chromium, I have not found anything really specific to the Refresh header. I haven't explored the source code yet.
As for the Opera bug tracker, it is still closed. We tried to open it up when I was working there, and it didn't work.
Then you can also imagine the hierarchy of commands in a case like this:
HTTP/1.1 301 Permanent Redirect
Refresh: 0;url=http://example.net/refresh-header
Location: http://example.net/location
My guess is that the 301 with the Location HTTP header always wins, or at least that's what I hope.
I can find very early references to meta refresh, such as in the Netscape Developer documentation. The earliest mention seems to be An Exploration Of Dynamic Documents. I can't find the documentation for the Refresh HTTP header anywhere on old Netscape Web sites. (Thanks to the SecuriTeam Web site and Amit Klein.)
So another thing you obviously want to do, in addition to causing the current document to reload, is to cause another document to be reloaded in n seconds in place of the current document. This is easy. The HTTP response header will look like this:
Refresh: 12; URL=http://foo.bar/blatz.html
In June 1996, Jerry Jongerius posted about HTTP/1.1 Refresh header field comments
My concern with "Refresh" is that I do not want it to be a global concept (a browser can only keep track of one refresh)--it looks to be implemented this way in Netscape 2.x. I would like "Refresh" to apply to individual objects (RE: the message below to netscape).
which Roy T. Fielding replied to:
Refresh is no longer in the HTTP/1.1 document -- it has been deferred to HTTP/1.2 (or later).
Should it be documented? Well, there are plenty of issues and plenty of hacks around it; I have just touched the surface. Maybe it would indeed be worth documenting how it works as implemented now, and how it is supposed to work where there is no interoperability. If I were silly enough, maybe I would do this. HTTP, archeology and Web compatibility issues: that seems close enough to my vices.
Otsukare!
|
Robert O'Callahan: Paper Titles |
A few tips on computer science paper titles:
Titles of the form Catchy Project Name: What Our Project Is About are stilted. Show some imagination.
Titles of the form Towards [Some Goal We Totally Failed To Reach] are an obvious attempt to dress up failure as success. Don't do that.
Do write bold papers about negative results. Call your paper [Our Idea] Doesn't Work (And Here's Why) and I'll be excited to read it.
[Goal] Is Harder Than You Think would also get my attention.
If your paper title contains the word Aristotelian, I will never read your work again and skip the conference too --- but you get points for chutzpah.
Note: following this advice may harm your career. Consider a career where you don't have to publish or perish.
|
The Mozilla Blog: Please welcome Allison Banks, Vice President of People |
We’re thrilled to announce that Allison Banks is joining the leadership team at Mozilla today as our new Vice President of People.
As the leader of our global human resource team at Mozilla, Allison will be responsible, above all, for ensuring our people have what they need to help move our mission forward. Specifically, her team will develop and execute the people-related strategies and activities that will help to foster growth, innovation, and our overall organizational effectiveness.
With over 20 years of experience, Allison joins us most recently from GoPro where she served as Sr. Director of HR overseeing the hiring of 900 people, opening offices in seven countries, integrating acquisitions and building the HR processes and systems required to support a dynamic global organization. Prior to GoPro, she developed her HR expertise and track record for inspiring and supporting people at Perforce Software, Citibank, and Ingres.
Allison’s background, experience and passion for the human side of business are an exceptional fit for Mozilla.
She will be based in the Bay Area, working out of our Mozilla Space in San Francisco and our headquarters in Mountain View.
Please join me in welcoming Allison to Mozilla!
chris
Background:
Allison Banks, Vice President of People, Mozilla
Bio & Mozillians profile
LinkedIn profile
High-res photo
https://blog.mozilla.org/blog/2015/03/25/please-welcome-allison-banks-vice-president-of-people/
|
Air Mozilla: Product Coordination Meeting |
Weekly coordination meeting for Firefox Desktop & Android product planning between Marketing/PR, Engineering, Release Scheduling, and Support.
https://air.mozilla.org/product-coordination-meeting-20150325/
|
Mozilla WebDev Community: Beer and Tell – March 2015 |
Once a month, web developers from across the Mozilla Project get together to design the most dangerous OSHA-compliant workstation possible. While searching for loopholes, we find time to talk about our side projects and drink, an occurrence we like to call “Beer and Tell”.
There’s a wiki page available with a list of the presenters, as well as links to their presentation materials. There’s also a recording available courtesy of Air Mozilla.
A certain blog post author was first with dxr-cmd, a command-line client for making queries to DXR, Mozilla’s source code browser. The tool is installed via pip and supports any query you can make via the web interface. Output can be run through a pager utility such as less, and you can also control the syntax highlighting applied to the output.
Next up was phrawzty, who was not present but shared a link to AudioAddict.bundle, a Plex plugin that allows you to play music from AudioAddict-based services (such as radiotunes.com, di.fm, and more).
peterbe shared Redunter, a web service that helps hunt down unused CSS on your website. By embedding a small snippet of JS into your page and browsing through your website, Redunter will analyze the HTML being rendered and compare it to the CSS being served. The end result is a list of CSS rules that did not match any HTML that was delivered to the user. Redunter even works with sites that modify the DOM by watching for mutation events and tracking the altered HTML.
ScottMichaud returns with more fun stuff using the WebCL extension! Scott shared a demo of WebCL-powered audio where a virtual microphone was surrounded by individual raindrop sounds. By controlling the rate of raindrops, you can simulate a higher audio load and see the difference that pushing audio processing to the GPU can make.
Senior Space Cadet lorchard shared Parsec Patrol, a vector-based space game for the web. While there’s no full game made yet, there is a webpage with several demos showing collision detection, spaceship navigation, missiles, point-defense systems, and more!
Have you ever seen an abbreviation like l10n or i18n and had no idea what it meant? Have no fear, Uncle Potch is here with a9r, the answer to the abbreviation problem! Simply install the command and enter in an abbreviation to receive a list of all words in the SOWPODS word list that match. Got a word that you need to abbreviate? Not only can a9r decipher abbreviations, it can create them!
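As an aside, the numeronym trick behind names like l10n and i18n is simple enough to sketch in a few lines of Python. This is just an illustration of the idea, not a9r's actual code, and the tiny word list here stands in for SOWPODS:

import re

def expand(abbrev, words):
    """Expand a numeronym like 'i18n' against a word list."""
    m = re.fullmatch(r"([a-z])(\d+)([a-z])", abbrev.lower())
    if not m:
        return []
    first, count, last = m.group(1), int(m.group(2)), m.group(3)
    return [w for w in words if len(w) == count + 2 and w[0] == first and w[-1] == last]

def abbreviate(word):
    """Create a numeronym: first letter, count of middle letters, last letter."""
    return word[0] + str(len(word) - 2) + word[-1] if len(word) > 2 else word

words = ["internationalization", "localization", "accessibility"]  # stand-in word list
print(expand("i18n", words))       # ['internationalization']
print(abbreviate("localization"))  # 'l10n'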
In a slightly-less-whimsical vein, potch also shared socketpeer, a simple JavaScript library for 1:1 messaging via WebRTC Data Channels and WebSockets. Extracted from the Tanx demo that Mozilla showed at GDC 2015, socketpeer contains both a server API for establishing peer connections between users and a client API to handle the client-side communication. Potch also shared a demo of a peer-to-peer chat application using socketpeer.
Next up was cvan, who shared PhantomHAR, a PhantomJS and SlimerJS script that generates an HTTP Archive (or HAR) for a URL. A HAR is an archive of data about HTTP transactions that can be used to export detailed performance data for tools to consume and analyze, and PhantomHAR allows you to easily generate the HAR for use by these tools.
Next, cvan shared fetch-manifest, a small library that takes a URL, locates the W3C web app manifest for the page, fixes any relative URLs in the manifest, and returns it. This is useful for things like app marketplaces that want to allow people to submit web apps by submitting a single URL to the app they want to submit.
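The same idea is easy to sketch outside JavaScript; the following rough Python version (an illustration only, not the fetch-manifest library itself) shows the three steps described above: find the manifest link, resolve it against the page URL, and absolutize the URLs inside it.

import json
import re
from urllib.parse import urljoin
from urllib.request import urlopen

def fetch_manifest(page_url):
    """Illustrative only: locate a page's web app manifest and fix relative URLs."""
    html = urlopen(page_url).read().decode("utf-8", "replace")
    # Very rough <link rel="manifest" href="..."> matching; a real tool would parse the HTML.
    m = re.search(r'<link[^>]*rel=["\']manifest["\'][^>]*href=["\']([^"\']+)["\']', html, re.I)
    if not m:
        return None
    manifest_url = urljoin(page_url, m.group(1))
    manifest = json.loads(urlopen(manifest_url).read().decode("utf-8"))
    # Absolutize the common relative URLs inside the manifest.
    if "start_url" in manifest:
        manifest["start_url"] = urljoin(manifest_url, manifest["start_url"])
    for icon in manifest.get("icons", []):
        if "src" in icon:
            icon["src"] = urljoin(manifest_url, icon["src"])
    return manifest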
Last up was bwalker, who shared robot-threejs, an experimental steampunk robot game powered by three.js and WebGL. The game currently allows you to fly around a 3D environment that has 3D positional audio emitting from an incredibly mysterious cube. CAN YOU SOLVE THE CUBE MYSTERY?
This month we think we’ve really got something special with our Seki Edge keyboard-and-mouse combo. Order now and get a free box of Band-aids at no additional cost!
If you’re interested in attending the next Beer and Tell, sign up for the dev-webdev@lists.mozilla.org mailing list. An email is sent out a week beforehand with connection details. You could even add yourself to the wiki and show off your side-project!
See you next month!
https://blog.mozilla.org/webdev/2015/03/25/beer-and-tell-march-2015/
|
Air Mozilla: The Joy of Coding (mconley livehacks on Firefox) - Episode 7 |
Watch mconley livehack on Firefox Desktop bugs!
https://air.mozilla.org/the-joy-of-coding-mconley-livehacks-on-firefox-episode-7/
|
Air Mozilla: Bugzilla Development Meeting |
Help define, plan, design, and implement Bugzilla's future!
https://air.mozilla.org/bugzilla-development-meeting-20150325/
|
Advancing Content: Content Services Team Adds New Talent With Partnerships (and Mozilla) Experience |
Earlier this year I wrote about how 2015 will be a big year for Mozilla to scale and build better personalized experiences as we help move the ad industry forward. Today, I’m excited to announce two new additions to our Content Services team as we continue our mission to create innovative content offerings while always upholding Mozilla’s commitment to user privacy.
Accomplished interactive advertising expert Aaron Lasilla has joined Mozilla and our Content Services team as head of content partnerships. Aaron comes to us from EA Games where he served as the global director of brand solutions and co-founded the in-game advertising group. Aaron was instrumental in negotiating and securing a number of strategic partnerships for EA’s publishing division as he built the group into a new business and revenue channel for EA, including the largest EA Online partnership ever (within Pogo.com’s casual games offering, in 2003). During his tenure, EA was established as the number one publisher of integrated advertising placements and partnerships in and around games. Aaron previously managed Microsoft’s Premium Games Advertising offering and also worked in sales and sponsorship capacities for Double Fusion, Clear Channel Entertainment and Kemper Sports Marketing.
As we continue to develop and refine our new offerings like Firefox Tiles, Aaron will be focusing on engagement and value exchange for Mozilla’s offerings while maintaining the same quality and standards of user experience that Mozilla is known for.
In addition, I’m excited to formally announce that long-time Mozillian Patrick Finch joined our group late last year as director of marketing. Patrick has been with Mozilla for over seven years, based out of Sweden, and has worked in a number of strategic roles on Mozilla’s desktop and mobile projects over that time. Prior to joining Mozilla, Patrick spent over ten years at Sun Microsystems in a variety of capacities, including working on numerous open source projects.
As we continue the rollout of Firefox Tiles and bring on new partners, you’ll probably be seeing more of Aaron and Patrick on this blog. If you’re interested in partnering with us in our mission or if you’d just like to drop our team a line, feel free to reach out to us at contentservices@mozilla.com.
|
Francois Marier: Keeping up with noisy blog aggregators using PlanetFilter |
I follow a few blog aggregators (or "planets") and it's always a struggle to keep up with the amount of posts that some of these get. The best strategy I have found so far is to filter them so that I remove the blogs I am not interested in, which is why I wrote PlanetFilter.
In my opinion, the first step in starting a new free software project should be to look for a reason not to do it. So I started by looking for another approach and by asking people around me how they dealt with the firehoses that are Planet Debian and Planet Mozilla.
It seems like a lot of people choose to "randomly sample" planet feeds and only read a fraction of the posts that are sent through there. Personally however, I find there are a lot of authors whose posts I never want to miss so this option doesn't work for me.
A better option that other people have suggested is to avoid subscribing to the planet feeds, but rather to subscribe to each of the author feeds separately and prune them as you go. Unfortunately, this whitelist approach is a high maintenance one since planets constantly add and remove feeds. I decided that I wanted to follow a blacklist approach instead.
PlanetFilter is a local application that you can configure to fetch your favorite planets and filter the posts you see.
If you get it via Debian or Ubuntu, it comes with a cronjob that looks at all configuration files in /etc/planetfilter.d/ and outputs filtered feeds in /var/cache/planetfilter/.
You can either add file:///var/cache/planetfilter/planetname.xml to your local feed reader, or serve it locally (e.g. http://localhost/planetname.xml) using a webserver.
The software will fetch new posts every hour and overwrite the local copy of each feed.
A basic configuration file looks like this:
[feed]
url = http://planet.debian.org/atom.xml
[blacklist]
There are currently two ways of filtering posts out. The main one is by author name:
[blacklist]
authors =
Alice Jones
John Doe
and the other one is by title:
[blacklist]
titles =
This week in review
Wednesday meeting for
In both cases, if a blog entry contains one of the blacklisted authors or titles, it will be discarded from the generated feed.
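For a feel of what that filtering amounts to, here is a rough Python sketch (not PlanetFilter's actual implementation) that drops Atom entries whose author is blacklisted, using only the standard library; the file names are placeholders:

import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"
BLACKLISTED_AUTHORS = {"Alice Jones", "John Doe"}

def filter_feed(in_path, out_path):
    """Remove blacklisted authors' entries from an Atom feed (illustration only)."""
    tree = ET.parse(in_path)
    feed = tree.getroot()
    for entry in list(feed.findall(ATOM + "entry")):
        name = entry.findtext(ATOM + "author/" + ATOM + "name", default="")
        if name.strip() in BLACKLISTED_AUTHORS:
            feed.remove(entry)  # discard the blacklisted entry
    tree.write(out_path, xml_declaration=True, encoding="utf-8")

filter_feed("planet.atom", "filtered.atom")  # placeholder file names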
Since blog updates happen asynchronously in the background, they can work very well over Tor.
In order to set that up in the Debian version of planetfilter:
Set the following in /etc/polipo/config:
proxyAddress = "127.0.0.1"
proxyPort = 8008
allowedClients = 127.0.0.1
allowedPorts = 1-65535
proxyName = "localhost"
cacheIsShared = false
socksParentProxy = "localhost:9050"
socksProxyType = socks5
chunkHighMark = 67108864
diskCacheRoot = ""
localDocumentRoot = ""
disableLocalInterface = true
disableConfiguration = true
dnsQueryIPv6 = no
dnsUseGethostbyname = yes
disableVia = true
censoredHeaders = from,accept-language,x-pad,link
censorReferer = maybe
Tell planetfilter to use the polipo proxy by adding the following to /etc/default/planetfilter:
export http_proxy="localhost:8008"
export https_proxy="localhost:8008"
The source code is available on repo.or.cz.
I've been using this for over a month and it's been working quite well for me. If you give it a go and run into any problems, please file a bug!
I'm also interested in any suggestions you may have.
http://feeding.cloud.geek.nz/posts/keeping-up-with-noisy-blog-aggregators-using-planetfilter/
|