
Planet Mozilla





Planet Mozilla - https://planet.mozilla.org/


Source: http://planet.mozilla.org/. This feed is generated automatically from the open RSS source at http://planet.mozilla.org/rss20.xml and is updated as that source updates; it may not match the content of the original page.

Brett Gaylor: From Mozilla to new making

Saturday, October 11, 2014, 21:00

Yesterday was my last day as an employee of the Mozilla Foundation. I’m leaving my position as VP, Webmaker to create an interactive web series about privacy and the economy of the web.

I’ve had the privilege of being a “crazy Mofo” for nearly five years. Starting in early 2010, I worked with David Humphrey and researchers at the Center for Development of Open Technology to create Popcorn.js. Having just completed “Rip!”, I was really interested in mashups - and Popcorn was a mashup of open web technology questions (how can we make video as elemental an element of the web as images or links?) and formal questions about documentary (what would a “web native” documentary look like? What can video do on the web that it can’t do on TV?). That mashup is one of the most exciting creative projects I’ve ever been involved with, and led to a wonderful amount of unexpected innovation and opportunity: an award-winning 3D documentary by a pioneer of web documentaries, the technological basis of a cohort of innovative (and fun) startups, and a kick-ass video creation tool that was part of the DNA of Webmaker.org - which this year reached 200,000 users and facilitated the learning experience of over 127,200 learners face to face at our annual Maker Party.

Thinking about video and the web, and making things that aim to get the best of both mediums, is what brought me to Mozilla - and it’s what’s taking me to my next adventure.

I’m joining my friends at Upian in Paris (remotely, natch) to direct a multi-part web series around privacy, surveillance and the economy of the web. The project is called Do Not Track and it’s supported by the National Film Board of Canada, Arte, Bayerischer Rundfunk (BR), the Tribeca Film Institute and the Centre National du Cinéma. I’m thrilled by the creative challenge and humbled by the company I’ll be keeping - I’ve wanted to work with Upian since their seminal web documentary “Gaza/Sderot”, and have been thrilled to watch from the sidelines as they’ve made Prison Valley, Alma, MIT’s Moments of Innovation project, and the impressive amount of work they do for clients in France and around the world. These are some crazy mofos, and they know how to ship.

Fake it Till You Make it

Nobody knows what they’re doing. I can’t stress this enough.

— God (@TheTweetOfGod) September 20, 2014

Mozilla gave me a wonderful gift: to innovate on the web, to dream big, without asking permission to do so. To in fact internalize innovation as a personal responsibility. To hammer into me every day the belief that for the web to remain a public resource, the creativity of everyone needs to be brought to the effort. That those of us in positions of privilege have a responsibility to wake up every day trying to improve the network. It’s a calling that tends to attract really bright people, and it can elicit strong feelings of impostor syndrome in a clueless filmmaker. The gift Mozilla gave me is to witness firsthand that even the most brilliant people, or especially the most brilliant people, are making it up every single day. That’s why the web remains as much an inspiration to me today as when I first touched it as a teenager. Even though smart people criticize Silicon Valley’s hypercapitalism, and while governments are breeding cynicism and mistrust by using the network for surveillance, I still believe the web remains the best place to invent your future.

I’m very excited, and naturally a bit scared, to be making something new again. Prepare yourself - I’m going to make shit up. I’ll need your help.

Working With


“Where some people choose software projects in order to solve problems, I have taken to choosing projects that allow me to work with various people. I have given up the comfort of being an expert, and replaced it with a desire to be alongside my friends, or those with whom I would like to be friends, no matter where I find them. My history among this crowd begins with friendships, many of which continue to this day.

This way of working, where collegiality subsumes technology or tools, is central to my personal and professional work. Even looking back over the past two years, most of the work I’ve done is influenced by a deep desire to work with rather than on.” - On Working With Instead of On

David Humphrey, who wrote that, is who I want to be when I grow up. I will miss daily interactions with him, and many others who know who they are, very much. “In the context of working with, technology once again becomes the craft I both teach and am taught; it is what we share with one another, the occasion for our time together, the introduction, but not the reason, for our friendship.”

Thank you, Mozilla, for a wonderful introduction. Till the next thing we make!

http://brettgaylor.tumblr.com/post/99738354192


Mozilla WebDev Community: Webdev Extravaganza – October 2014

Friday, October 10, 2014, 18:06

Once a month, web developers from across Mozilla don our VR headsets and connect to our private Minecraft server to work together building giant idols of ourselves for the herds of cows and pigs we raise to worship as gods. While we build, we talk about the work that we’ve shipped, share the libraries we’re working on, meet new folks, and talk about whatever else is on our minds. It’s the Webdev Extravaganza! The meeting is open to the public; you should stop by!

You can check out the wiki page that we use to organize the meeting, view a recording of the meeting in Air Mozilla, or attempt to decipher the aimless scrawls that are the meeting notes. Or just read on for a summary!

Shipping Celebration

The shipping celebration is for anything we finished and deployed in the past month, whether it be a brand new site, an upgrade to an existing one, or even a release of a library.

Phonebook now Launches Dialer App

lonnen shared the exciting news that the Mozilla internal phonebook now launches the dialer app on your phone when you click phone numbers on a mobile device. He also warned that anyone who has a change they want to make to the phonebook app should let him know before he forgets all that he had to learn to get this change out.

Open-source Citizenship

Here we talk about libraries we’re maintaining and what, if anything, we need help with for them.

django-browserid 0.11 is out

I (Osmose) chimed in to share the news that a new version of django-browserid is out. This version brings local assertion verification, support for offline development, support for Django 1.7, and other small fixes. The release is backwards-compatible with 0.10.1, and users on older versions can use the upgrade guide to get up-to-date. You can check out the release notes for more information.

mozUITour Helper Library for Triggering In-Chrome Tours

agibson shared a wrapper around the mozUITour API, which was used in the Australis marketing pages on mozilla.org to trigger highlights for new features within the Firefox user interface from JavaScript running in the web page. More sites are being added to the whitelist, and more features are being added to the API to open up new opportunities for in-chrome tours.

Parsimonious 0.6 (and 0.6.1) is Out!

ErikRose let us know that a new version of Parsimonious is out. Parsimonious is a parsing library written in pure Python, based on formal Parsing Expression Grammars (PEGs). You write a specification for the language you want to parse in a notation similar to EBNF, and Parsimonious does the rest.

The latest version includes support for custom rules, which let you hook in custom Python code for handling cases that are awkward or impossible to describe using PEGs. It also includes a @rule decorator and some convenience methods on the NodeVisitor class that simplify the common case of single-visitor grammars.
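To make the PEG idea concrete, here is a tiny, self-contained sketch in plain Python. This is not Parsimonious's API, just an illustration of how PEG rules compose as functions and how ordered choice commits to the first matching alternative:

```python
# Each "rule" is a function: given (text, pos) it returns the new
# position after a match, or None on failure.

def literal(s):
    def parse(text, pos):
        return pos + len(s) if text.startswith(s, pos) else None
    return parse

def choice(*alts):
    # PEG ordered choice: try alternatives left to right, commit to
    # the first one that matches (this is what makes PEGs unambiguous).
    def parse(text, pos):
        for alt in alts:
            end = alt(text, pos)
            if end is not None:
                return end
        return None
    return parse

def sequence(*parts):
    # Match each part in order, threading the position through.
    def parse(text, pos):
        for part in parts:
            pos = part(text, pos)
            if pos is None:
                return None
        return pos
    return parse

# Grammar: greeting = ("hello" / "hi") " world"
greeting = sequence(choice(literal("hello"), literal("hi")),
                    literal(" world"))
```

A real Parsimonious grammar expresses the same structure declaratively in an EBNF-like notation, and the library builds the parse tree for you.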

contribute.json Wants More Prettiness

peterbe stopped by to show off the design changes on the contribute.json website. There’s more work to be done; if you’re interested in helping out with contribute.json, let him know!

New Hires / Interns / Volunteers / Contributors

Here we introduce any newcomers to the Webdev group, including new employees, interns, volunteers, or any other form of contributor.

  • Cory Price (IRC nick: ckprice): Web Production Engineer, working on various projects.

Roundtable

The Roundtable is the home for discussions that don’t fit anywhere else.

Leeroy was Broken for a Bit

lonnen wanted to let people know that Leeroy, a service that triggers Jenkins test runs for GitHub pull requests, was broken for a bit due to the accidental deletion of the VM that was running the app. But it’s fixed now! Probably.

Webdev Module Updates

lonnen also shared some updates that have happened to the Mozilla Websites modules in the Mozilla Module System.

Static Caching and the State of Persona

peterbe raised a question about the cache timeouts on static assets loaded from Persona by implementing sites. In response, I gave a quick overview of the current state of Persona:

  • Along with callahad, djc has been named as co-maintainer, and the two are currently focusing on simplifying the codebase in order to make contribution easier.
  • A commitment to run the servers for Persona for a minimum period of time is currently working its way through approval, in order to help ease fears that the Persona service will just disappear.
  • Mozilla still has a paid operations employee who manages the Persona service and makes sure it is up and available. Persona is still accepting pull requests and will review, merge, and deploy them when they come in. Don’t be shy, contribute!

The answer to peterbe’s original question was “make a pull request and they’ll merge and push!”.

Graphviz graphs in Sphinx

ErikRose shared sphinx.ext.graphviz, which allows you to write Graphviz code in your documentation and have visual graphs be generated from the code. DXR uses it to render flowcharts illustrating the structure of a DXR plugin.
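Usage is a single directive in your reStructuredText source, assuming `sphinx.ext.graphviz` has been added to the `extensions` list in the Sphinx project's `conf.py`; the graph below is a made-up example:

```rst
.. graphviz::

   digraph plugin {
      "indexer" -> "database" -> "web frontend";
   }
```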


Turns out that building giant statues out of TNT was a bad idea. On the bright side, we won’t be running out of pork or beef any time soon.

If you’re interested in web development at Mozilla, or want to attend next month’s Extravaganza, subscribe to the dev-webdev@lists.mozilla.org mailing list to be notified of the next meeting, and maybe send a message introducing yourself. We’d love to meet you!

See you next month!

 

https://blog.mozilla.org/webdev/2014/10/10/webdev-extravaganza-october-2014/


Carsten Book: Mozilla Plugincheck – It’s a Community Thing

Friday, October 10, 2014, 14:57

Hi,

A lot of people are using Mozilla’s Plugincheck page to make sure all their plugins, like Adobe Flash, are up-to-date.

Schalk Neethling has created a great blog post about Plugincheck here.

So if you are interested in contributing to Plugincheck, check out Schalk’s blog post!

Thanks!

 

– Tomcat

https://blog.mozilla.org/tomcat/2014/10/10/mozilla-plugincheck-its-a-community-thing/


Carsten Book: The Past, Current and Future

Friday, October 10, 2014, 14:34

- The Past -
I’ve now been a member of the A-Team (Automation and Tools Team) for about a year, and I’m also a full-time sheriff.

It marked the end of a lot of changes, both personal and job-wise. I moved from the Alps to the Munich area, to the city of Freising (and no, I was not at the Oktoberfest ;)), and started working as a full-time sheriff after my QA/partner build and releng duties.

It’s awesome to be part of the Sheriff team, and it was also awesome to get so much help getting started from team members like Ed, Wes and Ryan.

At one point I took over sheriff duty for my European timezone, and it was quite challenging being responsible for all the code trees, with backouts and, later, checkin-needed requests :) What I really like as a sheriff is the work across the different divisions at Mozilla - it’s exciting to work as a Mozilla sheriff :)

One of my main contributions, besides getting started, was helping to create some how-to articles at https://wiki.mozilla.org/Sheriffing/How:To . I hope they will help others get involved in sheriffing too.

- Current -

We just switched over to use Treeherder as new tool for Sheriffing.

It’s quite new, and it feels that way (as with everything new - like a new car - there are questions like “how do I do the things I used to do in my old car?”), and there are still some bugs and things we can improve, but we will get there. It’s also an ideal time to get involved in sheriffing, for example by hammering on Treeherder.

and that leads into… :)

- The Future is You! -

Like every open source project, Mozilla is heavily dependent on community members like you, and among all the other areas at Mozilla there is even the opportunity to work as a community sheriff. So let us know if you want to be involved as a community sheriff - you are always welcome. You can find us in the #ateam channel on irc.mozilla.org.

As for myself, besides my other tasks, I’m planning to work more on community building, like blogging about sheriffing and taking more part in the open source meetings in Munich.

– Tomcat

https://blog.mozilla.org/tomcat/2014/10/10/the-past-current-and-future/


Mozilla Reps Community: Council Elections – Campaign and candidates

Friday, October 10, 2014, 13:59

We’re excited to announce that we have 7 candidates for the fourth cycle of the Council elections, scheduled for October 18th. The Council has carefully reviewed the candidates and agrees that they are all extremely strong candidates to represent the Mozilla Reps program and the interests of Reps.

The candidates are:

Now, it is up to Reps to elect the candidates to fill the four available seats for a 12-month term.

As detailed in the wiki, we are now entering the “campaign” phase of this election cycle. This means that for the next 7 days, candidates will all have an opportunity to communicate their agenda, plans, achievements as a Rep/Mentor, and personal strengths to the Mozilla Reps voting body. They are encouraged to use their personal Mozilla Reps profile page, their personal website/blog, a Mozilla wiki page or any other channel that they see fit to post information regarding their candidacy.

To guide them in this effort, the Council has prepared 6 questions that each candidate is asked to answer. We had originally wanted to have candidates go through mozmoderator, but due to lack of time we will do this next election cycle. The questions are the following:

  • What are the top three issues that you would want the Council to address were you to join the Council?
  • What is in your view the Mozilla Reps program’s biggest strength and weakness?
  • Identify something that is currently not working well in the Mozilla Reps program and which you think could be easy to fix?
  • What past achievement as a Rep or Mentor are you most proud of?
  • What are the specific qualities and skills that you have that you think will help you be an effective Council member?
  • As a Mentor, what do you do to try to encourage your inactive Mentees to be active again?

In the spirit of innovation, and to help bring a human face to the election process, the Council would like to add a new element to the campaign: video. This video is optional, but we strongly encourage candidates to create one.

That’s it for now. As always, if you have any questions, please don’t hesitate to ask the Council at

reps-council at mozilla dot com

We’ll be giving regular election updates throughout these next two weeks, so stay tuned!

And remember, campaigning ends and voting starts on October 18th!

Comments on discourse.

https://blog.mozilla.org/mozillareps/2014/10/10/council-elections-campaing-and-candidates/


Daniel Stenberg: internal timers and timeouts of libcurl

Friday, October 10, 2014, 10:29

Bear with me. It is time to take a deep dive into the libcurl internals and see how it handles timeouts and timers. This is meant as useful information for libcurl users, but even more as insight for people who’d like to fiddle with libcurl internals and work on its source code and architecture.

socket activity or timeout

Everything internally in libcurl is using the multi, asynchronous, interface. We avoid blocking calls as far as we can. This means that libcurl always either waits for activity on a socket/file descriptor or for the time to come to do something. If there’s no socket activity and no timeout, there’s nothing to do and it just returns back out.

It is important to remember here that the API for libcurl doesn’t force the user to call it again within or at the specific time and it also allows users to call it again “too soon” if they like. Some users will even busy-loop like crazy and keep hammering the API like a machine-gun and we must deal with that. So, the timeouts are mostly to be considered advisory.

many timeouts

A single transfer can have multiple timeouts: for example, one maximum time for the entire transfer, one for the connection phase, and perhaps even more timers that handle things like speed caps (which make libcurl not transfer data faster than a set limit) or detecting transfer speeds below a certain threshold within a given time period.

A single transfer is done with a single easy handle, which keeps all of its timeouts in a sorted list. This allows libcurl to return a single time left until the nearest timeout expires, without having to bother with the remainder of the timeouts (yet).

Curl_expire()

… is the internal function to set a timeout to expire a certain number of milliseconds into the future. It adds a timeout entry to the list of timeouts. Expiring a timeout just means that it’ll signal the application to call libcurl again. Internally we don’t have any identifiers to the timeouts, they’re just a time in the future we ask to be called again at. If the code needs that specific time to really have passed before doing something, the code needs to make sure the time has elapsed.

Curl_expire_latest()

A newcomer to the timeout team. I figured out we need this function for states where we need to be called no later than a certain specific future time. It will not add a new timeout entry to the timeout list if there is already a timeout that expires earlier than the specified time limit.

This function is useful for example when there’s a state in libcurl that varies over time but has no specific time limit to check for. Like transfer speed limits and the like. If Curl_expire() is used in this situation instead of Curl_expire_latest() it would mean adding a new timeout entry every time, and for the busy-loop API usage cases it could mean adding an excessive amount of timeout entries. (And there was a scary bug reported that got “tens of thousands of entries” which motivated this function to get added.)
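As a rough illustration, the difference between the two calls can be modeled with a sorted list. This is a toy model only; the names and structure are simplified, not libcurl's actual C code:

```python
import bisect

class EasyHandleTimeouts:
    """Toy model of one easy handle's sorted timeout list, with the
    semantics of Curl_expire() and Curl_expire_latest() described
    above (times are plain millisecond values)."""

    def __init__(self):
        self.timeouts = []  # sorted expiry times, in ms

    def expire(self, now_ms, milliseconds):
        # Curl_expire(): unconditionally add a new entry to the list.
        bisect.insort(self.timeouts, now_ms + milliseconds)

    def expire_latest(self, now_ms, milliseconds):
        # Curl_expire_latest(): only add an entry if no existing
        # timeout already expires earlier than the requested time.
        when = now_ms + milliseconds
        if not self.timeouts or self.timeouts[0] > when:
            bisect.insort(self.timeouts, when)

    def time_until_next(self, now_ms):
        # The single "time left until the nearest timeout" that the
        # handle reports, ignoring the rest of the list.
        if not self.timeouts:
            return None
        return max(0, self.timeouts[0] - now_ms)
```

In the busy-loop scenario from the post, `expire()` would keep growing the list on every call, while `expire_latest()` stays at one entry as long as an earlier deadline is already pending.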

timeout removals

We don’t remove timeouts from the list until they expire. Like for example if we have a condition that is timing dependent, then we set a timeout with Curl_expire() and we know we should be called again at the end of that time.

If we wouldn’t add the timeout and there’s no socket activity on the socket then we may not be called again – ever.

When an internal state transitions into something else and we therefore no longer need a previously set timeout, we have no handle or identifier for the timeout, so it cannot be removed. Instead, we get called again when the timeout triggers, even though we no longer really need it. As the API allows this anyway, the logic already handles it, and getting called an extra time is usually very cheap and not considered a problem worth addressing.

Timeouts are removed automatically from the list of timers when they expire. Once an entry’s time has passed it is dropped from the list, and the following timers move to the front of the queue, to be used to calculate the next single timeout.

The only internal API to remove timeouts that we have removes all timeouts, used when cleaning up a handle.

many easy handles

I’ve mentioned how each easy handle treats their timeouts above. With the multi interface, we can have any amount of easy handles added to a single multi handle. This means one list of timeouts for each easy handle.

To handle many thousands of easy handles added to the same multi handle, each with its own timeout (as each easy handle only shows its closest timeout), libcurl builds a splay tree of easy handles sorted on the timeout time. It is a splay tree rather than a sorted list to allow really fast insertions and removals.

As soon as a timeout expires from one of the easy handles and it moves to the next timeout in its list, it means removing one node (easy handle) from the splay tree and inserting it again with the new timeout timer.
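That bookkeeping can be sketched like this, using a plain heap as a stand-in for the splay tree, purely for illustration:

```python
import heapq

def nearest_timeout(handles):
    """handles: dict mapping handle name -> sorted list of its timeout
    times (ms). Each easy handle contributes only its closest timeout;
    return (time, name) of the soonest one across all handles."""
    heap = [(t[0], name) for name, t in handles.items() if t]
    heapq.heapify(heap)
    return heap[0] if heap else None

def expire_soonest(handles):
    """Expire the soonest timeout: the owning handle drops it and is
    then keyed on its next timeout (in libcurl terms: one removal and
    one re-insertion in the splay tree). Returns the handle's name."""
    soonest = nearest_timeout(handles)
    if soonest is None:
        return None
    _, name = soonest
    handles[name].pop(0)
    return name
```

The splay tree gives the real implementation fast amortized insert/remove for this constant churn of re-keying handles, which a flat sorted list would not.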

http://daniel.haxx.se/blog/2014/10/10/internal-timers-and-timeouts-of-libcurl/


Ben Kero: September ’14 Mercurial Code Sprint

Friday, October 10, 2014, 01:15

A week ago I was fortunate enough to attend the latest code sprint of the Mercurial project. This was my second sprint with this project, and I took away quite a bit from the meeting. Attendance was around 20 people, and the sprint took the form of a large group, with smaller groups splitting off intermittently to discuss particular topics. I had seen a few of the attendees before at a previous sprint I attended.

Joining me at the sprint were two of my colleagues Gregory Szorc (gps) and Mike Hommey (glandium). They took part in some of the serious discussions about core bugfixes and features that will help Mozilla scale its use of Mercurial. Impressively, glandium had only been working on the project for mere weeks, but was able to make serious contributions to the bundle2 format (an upcoming feature of Mercurial). Specifically, we talked to Mercurial developers about some of the difficulties and bugs we’ve encountered with Mozilla’s “try” repository due to the “tens of thousands of heads” and the events that cause a serving request to spin forever.

By trade I’m a sysadmin/DevOps person, but I also do have a coder hat that I don from time to time. Still though, the sprint was full of serious coders who seemingly worked on Mercurial full-time. There were attendees who had big named employers, some of whom would probably prefer that I didn’t reveal their identities here.

Unfortunately, due to my lack of familiarity with a lot of the deep-down internals, I was unable to contribute to some of the discussions. It was primarily a learning experience for me, both about the process by which direction-driving decisions are made for the project (mpm’s BDFL status) and about all of the considerations that go into choosing a particular method to implement an idea.

That’s not to say I was entirely useless. My knowledge of systems and package management meant I was able to collaborate with another developer (kiilerix) to improve the Docker package building support, including preliminary work for building (un)official Debian packages for the first time.

I also learned about some infrequently used features of and tips about Mercurial. For example, folks who come from a background of using git often complain about Mercurial’s lack of interactive rebase functionality. The “histedit” extension provides this feature. Much like many other features of Mercurial, this is technically “in core”, but not enabled by default. Adding a line such as “histedit =” in the “[extensions]” section of your “hgrc” file enables it. It allows all the expected picking, folding, dropping, editing, and modifying of commit messages.
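The relevant hgrc fragment looks like this (the comment is mine):

```ini
[extensions]
# an empty value enables a bundled extension
histedit =
```

After that, `hg histedit` on a range of draft changesets opens the interactive editor.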

Changeset evolution is another feature that’s been coming for a long time. It enables developers to safely modify history and be able to propagate those changes to any down/upstream clones. It’s still disabled by default, but is available as an extension. Gregory Szorc, a colleague of mine, has written about it before. If you’re curious you can read more about it here.

One of the features I’m most looking forward to is sparse checkouts. Imagine, à la Perforce, being able to check out only a subtree or subtrees of a repository using ‘--include subdir1/’ and ‘--exclude subdir2/’ arguments during cloning/updating. This is what sparse checkouts will allow. Additionally, functionality is being planned to enable saved ‘profiles’ of subdirs for different uses. For instance, specifying the ‘--enable-profile mobile’ argument will apply a saved list of included and excluded items. This seems like a really powerful way of building lightweight build profiles for each different type of build we do. Unfortunately, to be properly implemented it is waiting on some other code to be developed, such as sharded manifests.

One last thing I’d like to tell you about is an upcoming free software project for Mercurial hosting named Kallithea. It was born from the liberated code of the RhodeCode project. It is still in its infancy (version 0.1 as of the writing of this post), but has some attractive features for viewing repositories, such as visualizations of changelog graphs, diffs, code reviews, a built-in editor, LDAP support, and even a JSON-RPC API for issue tracker integration.

In all I feel it was a valuable experience for me to attend that benefited both the Mercurial project and myself. I was able to lend some of my knowledge about building packages and familiarity with operations of large-scale hgweb serving, and was able to learn a lot about the internals of Mercurial and understand that even the deep core code of the project isn’t very scary.

I’m very thankful for my ability to attend and look forward to attending the next Sprint in the following year.

http://bke.ro/?p=400


Mozilla Open Policy & Advocacy Blog: Spotlight on the ACLU: A Ford-Mozilla Open Web Fellow Host

Thursday, October 9, 2014, 22:30

{The Ford-Mozilla Open Web Fellows applications are now open. To shed light on the fellowship, we will be featuring posts from the 2015 Host Organizations. Today’s post comes from Kade Crockford, the Director of the Technology for Liberty program at the ACLU of Massachusetts. We are so excited to have the ACLU as a host organization. It has a rich history of defending civil liberties, and has been on the forefront of defending Edward Snowden following his revelations of the NSA surveillance activities. The Ford-Mozilla Open Web fellow at the ACLU of Massachusetts will have a big impact in protecting Internet freedom.}


Spotlight on the ACLU: A Ford-Mozilla Open Web Fellow Host
By Kade Crockford, Director of Technology for Liberty, ACLU of Massachusetts

Intellectual freedom, the right to criticize the government, and freedom of association are fundamental characteristics of a democratic society. Dragnet surveillance threatens them all. Today, the technologies that provide us access to the world’s knowledge are mostly built to enable a kind of omnipotent tracking human history has never before seen. The law mostly works in favor of the spies and data-hoarders, instead of the people. We are at a critical moment as the digital age unfolds: Will we rebuild and protect an open and free Internet to ensure the possibility of democracy for future generations?

We need your help at the ACLU of Massachusetts to make sure we, as a society, answer that question in the affirmative.


The ACLU is the oldest civil rights and civil liberties organization in the U.S. It was founded in 1920 in the wake of the imprisonment of anti-World War I activists for distributing anti-war literature, and in the midst of widespread government censorship of materials deemed obscene, radical or insufficiently patriotic. In 1917, the U.S. Congress had passed the Espionage Act, making it a crime to interfere with military recruitment. A blatantly unconstitutional “Sedition Act” followed in 1918, making illegal the printing or utterance of anything “disloyal…scurrilous, or abusive” about the United States government. People like Rose Pastor Stokes were subsequently imprisoned for long terms for innocuous activity such as writing letters to the editor critical of US policy. In 1923, muckraking journalist Upton Sinclair was arrested simply for reading the text of the First Amendment at a union rally. Today, thanks to almost one hundred years of effective activism and impact litigation, people would be shocked if police arrested dissidents for writing antiwar letters to the editor.

But now we face an even greater threat: our primary means of communication, organization, and media—the Internet—is threatened by pervasive, dragnet surveillance. The Internet has opened up the world’s knowledge to anyone with a connection, but it has also put us under the microscope like never before. The stakes couldn’t be higher.

That’s why the ACLU—well versed in the Bill of Rights, constitutional precedent, community organizing, advocacy, and public education—needs your help. If we want to live in an open society, we must roll back corporate and government electronic tracking and monitoring, and pass on a free Internet to our children and theirs. We can’t do it without committed technologists who understand systems and code. Democracy requires participation and agitation; today, it also requires freedom fighters with computer science degrees.

Apply to become a Ford-Mozilla Open Web Fellow at the ACLU of Massachusetts if you want to put your technical skills to work on a nationally-networked team made up of the best lawyers, advocates, and educators. Join us as we work to build a free future. There’s much to be done, and we can’t wait for you to get involved.

After all, Internet freedom can’t protect itself.


Apply to be a 2015 Ford-Mozilla Open Web Fellow. Visit www.mozilla.org/advocacy.

https://blog.mozilla.org/netpolicy/2014/10/09/spotlight-on-the-aclu-a-ford-mozilla-open-web-fellow-host/


Julien Vehent: Automated configuration analysis for Mozilla's TLS guidelines

Thursday, October 9, 2014, 19:24

Last week, we updated Mozilla's Server Side TLS guidelines to add a third recommended configuration. Each configuration maps to a target compatibility level:

  1. Old supports Windows XP pre-SP2 with IE6 and IE7. Those clients do not support AES ciphers, and for them we need to maintain a configuration that accepts 3DES, SSLv3 and SHA-1 certificates.
  2. Intermediate is the new default, and supports clients from Firefox 1 until now. Unlike the old configuration, SSLv3, 3DES and SHA-1 are disabled. We also recommend using a Diffie-Hellman parameter of 2048 bits when PFS DHE ciphers are in use (note that java 6 fails with a DH param > 1024 bits, use the old configuration if you need java 6 compatibility).
  3. Modern is what we would really love to enable everywhere, but is not yet supported by enough clients. This configuration only accepts PFS ciphers and TLSv1.1+. Unfortunately, clients older than Firefox 27 will fail to negotiate this configuration, so we reserve it for services that do not need backward compatibility before FF27 (webrtc, sync1.5, ...).

Three recommended configurations mean more choice, but also more work to evaluate a given endpoint. To help with the analysis of real-world TLS setups, we rely on cipherscan, a wrapper around openssl s_client that quickly pulls the TLS configuration from a target. I wrote the initial version of cipherscan last year, and I'm very happy to see it grow with major contributions from Hubert Kario (Red Hat) and a handful of other people.

Today I'm releasing an extension to cipherscan that evaluates a scan result against our guidelines. By running it against a target, it will tell you what the current configuration level is, and what should be changed to reach the next level.
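The classification idea can be sketched in a few lines of Python. This is an illustrative toy, not the actual analyze.py logic: the cipher sets below are abbreviated assumptions, far shorter than the real guideline lists.

```python
# Toy classifier in the spirit of analyze.py: map a scanned cipher list
# to a Mozilla-style level. The level definitions below are abbreviated
# illustrations, not the full lists from the Server Side TLS guidelines.
MODERN_ONLY = {"ECDHE-RSA-AES128-GCM-SHA256", "ECDHE-RSA-AES256-GCM-SHA384"}
BAD = {"RC4-SHA", "RC4-MD5", "ECDHE-RSA-RC4-SHA"}

def classify(ciphers):
    """Return a level string for a list of cipher names from a scan."""
    s = set(ciphers)
    if s & BAD:
        return "bad ssl"            # RC4 present: below any recommended level
    if s <= MODERN_ONLY:
        return "modern tls"         # only PFS/AEAD ciphers enabled
    if "DES-CBC3-SHA" in s:
        return "old tls"            # 3DES kept for very old clients
    return "intermediate tls"

print(classify(["ECDHE-RSA-AES128-GCM-SHA256", "AES128-SHA"]))
# prints: intermediate tls
```

The real tool goes further, of course: it also checks protocol versions, certificate signatures and OCSP stapling, and prints the delta to each level rather than a single label.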

$ ./analyze.py -t jve.linuxwall.info
jve.linuxwall.info:443 has intermediate tls

Changes needed to match the old level:
* consider enabling SSLv3
* add cipher DES-CBC3-SHA
* use a certificate with sha1WithRSAEncryption signature
* consider enabling OCSP Stapling

Changes needed to match the intermediate level:
* consider enabling OCSP Stapling

Changes needed to match the modern level:
* remove cipher AES128-GCM-SHA256
* remove cipher AES256-GCM-SHA384
* remove cipher AES128-SHA256
* remove cipher AES128-SHA
* remove cipher AES256-SHA256
* remove cipher AES256-SHA
* disable TLSv1
* consider enabling OCSP Stapling

The analysis above evaluates my blog. I'm aiming for intermediate level here, and it appears that I reach it. I could improve further by enabling OCSP Stapling, but that's not a hard requirement.

If I wanted to reach modern compatibility, I would need to remove a few ciphers that are not PFS, disable TLSv1 and, again, enable OCSP Stapling. I would probably want to update my ciphersuite to the one proposed on Server Side TLS #Modern compatibility.

Looking at another site, twitter.com, the script returns "bad ssl". This is because Twitter still accepts RC4 ciphers, and in the opinion of analyze.py, that is a bad thing to do. We really don't trust RC4 anymore.

$ ./analyze.py -t twitter.com
twitter.com:443 has bad ssl

Changes needed to match the old level:
* remove cipher ECDHE-RSA-RC4-SHA
* remove cipher RC4-SHA
* remove cipher RC4-MD5
* use a certificate with sha1WithRSAEncryption signature
* consider enabling OCSP Stapling

Changes needed to match the intermediate level:
* remove cipher ECDHE-RSA-RC4-SHA
* remove cipher RC4-SHA
* remove cipher RC4-MD5
* remove cipher DES-CBC3-SHA
* disable SSLv3
* consider enabling OCSP Stapling

Changes needed to match the modern level:
* remove cipher ECDHE-RSA-RC4-SHA
* remove cipher AES128-GCM-SHA256
* remove cipher AES128-SHA
* remove cipher RC4-SHA
* remove cipher RC4-MD5
* remove cipher AES256-SHA
* remove cipher DES-CBC3-SHA
* disable TLSv1
* disable SSLv3
* consider enabling OCSP Stapling

The goal of analyze.py is to help operators define a security level for their site, and use this script to verify their configuration. If you want to check compatibility with a target level, you can use the -l flag to specify the level you want:

$ ./analyze.py -t stooge.mozillalabs.com -l modern
stooge.mozillalabs.com:443 has modern tls

Changes needed to match the modern level:
* consider enabling OCSP Stapling

Our guidelines are opinionated, and you could very well disagree with some of the recommendations. The discussion is open on the Talk section of the wiki page; I'm always happy to discuss the recommendations and make them helpful to as many people as possible.

You can get cipherscan and analyze.py from the github repository at https://github.com/jvehent/cipherscan.

https://jve.linuxwall.info/blog/index.php?post/2014/10/09/Automated-configuration-analysis-for-Mozilla-s-TLS-guidelines


Michael Kaply: Firefox 24 ESR EOL

Thursday, October 9, 2014, 17:45

I just want to take a moment to remind everyone that the Firefox 24 ESR will be officially replaced by the Firefox 31 ESR this coming Tuesday, October 14, 2014. At that time, the Firefox 24 ESR will be unsupported. Firefox 24 ESR users will be automatically upgraded to the Firefox 31 ESR.

I would hope by now everyone has tested with the Firefox 31 ESR, but if you haven't, it might be time to start.

The CCK2 has been fully updated to work with Firefox 31 and beyond.

On another note, there are major packaging changes coming to Firefox on Mac due to changes to the way applications are signed. You can read more about it in this bug. This will primarily impact the locations of autoconfig files, preferences and the distribution directory. I'll try to find some time soon to document these changes.

http://mike.kaply.com/2014/10/09/firefox-24-esr-eol/


Mozilla Release Management Team: Firefox 33 beta9 to RC

Thursday, October 9, 2014, 15:58

  • 13 changesets
  • 30 files changed
  • 215 insertions
  • 157 deletions

Extensions by number of occurrences:

  • cpp: 6
  • h: 4
  • ini: 2
  • sh: 1
  • nsh: 1
  • mm: 1
  • html: 1
  • hgtags: 1

Modules by number of occurrences:

  • mobile: 12
  • content: 5
  • browser: 4
  • gfx: 3
  • widget: 2
  • toolkit: 1
  • testing: 1
  • js: 1

List of changesets:

JW WangBug 994292 - Call SpecialPowers.pushPermissions() to ensure permission change is completed before continuing the rest of the tests. r=baku, a=test-only - 16bd77984527
Ryan VanderMeulenBug 1025040 - Disable test_single_finger_desktop.py on Windows for frequent failures. a=test-only - 3d1029947008
J. Ryan StinnettBug 989168 - Disable browser_manifest_editor. r=jryans, a=test-only - 7fefb97d2f75
Jim MathiesBug 1068189 - Force disable browser.tabs.remote.autostart in non-nightly builds. r=felipe, a=sledru - 5217e39df54c
Jim MathiesBug 1068189 - Take into account 'layers.offmainthreadcomposition.testing.enabled' settings when disabling remote tabs. r=billm, a=sledru - 7b2887bd78a0
Randell JesupBug 1077274: Clean up tracklists r=jib a=dveditz - f0253d7268bb
Kan-Ru Chen (

http://release.mozilla.org/statistics/33/2014/10/09/fx-33-b9-to-rc.html


Doug Belshaw: Survey: 5 proposals for Web Literacy Map 2.0

Thursday, October 9, 2014, 13:13

We’re currently working on a v2.0 of Mozilla’s Web Literacy Map. From the 38 interviews with stakeholders and community members I’ve identified 21 emerging themes for Web Literacy Map 2.0 as well as some ideas for Webmaker. The canonical home for everything relating to this update can now be found on the Mozilla wiki.

Cat with tail

While there are some decisions that need to be made by paid contributors / staff (e.g. design, combining competencies, wording of skills) there are some that should be made by the wider community. I’ve come up with five proposals in this survey:


http://goo.gl/forms/LKNSNrXCnu


The five proposals are:

  1. I believe the Web Literacy Map should explicitly reference the Mozilla manifesto.
  2. I believe the three strands should be renamed ‘Reading’, ‘Writing’ and ‘Participating’.
  3. I believe the Web Literacy Map should look more like a ‘map’.
  4. I believe that concepts such as ‘Mobile’, ‘Identity’, and ‘Protecting’ should be represented as cross-cutting themes in the Web Literacy Map.
  5. I believe a ‘remix’ button should allow me to remix the Web Literacy Map for my community and context.

Please do take the time to fill in the survey. Any meta feedback should go to @dajbelshaw / doug@mozillafoundation.org.

http://literaci.es/weblitmap2-survey


Daniel Stenberg: Coverity scan defect density: 0.00

Thursday, October 9, 2014, 11:14

A couple of days ago I decided to stop slacking and grab this long dangling item in my TODO list: run the coverity scan on a recent curl build again.

Among the static analyzers, Coverity does in fact stand out as the very best one I can use. We run clang-analyzer against curl every night and it hasn’t reported any problems at all in a while. This time I got almost 50 new issues reported by Coverity.

In short, a little less than half of them were intentional: for example, we got several reports about ignored return codes we really don’t care about, and several reports of dead code that is conditionally built on platforms other than the one I ran the scan on.

But there were a whole range of legitimate issues. Nothing really major popped up but a range of tiny flaws that were good to polish away and smooth out. Clearly this is an exercise worth repeating every now and then.

End result

21 new curl commits that mention Coverity. Coverity now says “defect density: 0.00” for curl and libcurl since it doesn’t report any more flaws. (That’s the number of flaws found per thousand lines of source code.)
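Since the metric is just flaws per thousand lines, the arithmetic is trivial; the line count below is a made-up illustration, not curl's actual size:

```python
def defect_density(defects, lines_of_code):
    """Coverity-style defect density: outstanding flaws per 1000 lines of code."""
    return defects / (lines_of_code / 1000.0)

# With zero outstanding defects, the density is 0.00 regardless of code size:
print("%.2f" % defect_density(0, 100000))   # prints: 0.00
```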

Want to see?

I can’t seem to make all the issues publicly accessible, but if you do want to check them out in person, just click over to the curl project page at Coverity and “request more access” and I’ll grant you view access, no questions asked.

http://daniel.haxx.se/blog/2014/10/09/coverity-scan-defect-density-0-00/


Eitan Isaacson

Thursday, October 9, 2014, 03:13

An understated feature in desktop Firefox is the option to suppress the text and background colors that content authors choose for us, and instead go with the plain old black on white with a smattering of blue and purple links. In other words, 1994.

Why is this feature great? Because it hands control back to the user and allows people with visual impairments to tweak things just enough to make the web readable.

Somebody once asked on the #accessibility IRC channel why they can’t turn off content colors in Firefox for Android. So it seemed like a good idea to re-introduce that option in the form of an extension. There are a few color-related add-ons on AMO, but I just submitted another one, and you can get it here. This is what the toggle option looks like:

Remove colors option in tools menu

Since the color attribute was introduced, the web has evolved a lot. We really can’t go back to the naive, monochrome days of the 90s. Many sites use background images and colors in novel ways, and use backgrounds to portray important information. Sometimes disabling page colors will really break things. So once you remove colors from AMO, you get:

AMO with colors removed

Okayish, eh?

As you can see, it isn’t perfect, but it does make the text more readable to some. Having a menu item that doesn’t take too much digging to find will hopefully help folks go back and forth between the two modes and get the best out of both worlds.


http://blog.monotonous.org/2014/10/08/622/


Peter Bengtsson: Premailer on Python 3

Thursday, October 9, 2014, 01:00

Premailer is probably my most successful open source project in recent years. I base that on the fact that 25 different people have committed to it.

Today I merged a monster PR by Michael Jason Smith of OnlineGroups.net.

What it does, basically, is make premailer work on Python 3, PyPy and Python 2.6. Check out the tox.ini file. Test coverage is still 100%.

If you look at the patch, the core of the change is actually surprisingly small. The majority of the "secret sauce" is basically a bunch of import statements guarded by if sys.version_info >= (3, ): and various minor changes around encoding UTF-8. The rest of the changes are basically test sit-ups.
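The pattern looks roughly like this. This is a generic sketch of the approach, not the actual premailer diff; the `urljoin` import and the `absolutize` helper are illustrative choices of mine:

```python
import sys

# Single-source Python 2/3 compatibility: pick the right stdlib names
# at import time, then use one alias everywhere else in the module.
if sys.version_info >= (3,):
    from urllib.parse import urljoin
    text_type = str
else:                                  # Python 2.6 / 2.7
    from urlparse import urljoin
    text_type = unicode  # noqa: F821 (only defined on Python 2)

def absolutize(base, href):
    """Join a possibly-relative href against a base URL."""
    return urljoin(base, href)

print(absolutize("http://example.com/a/", "b.html"))
# prints: http://example.com/a/b.html
```

Done this way, the version check runs once at import time and the rest of the module stays oblivious to which Python it is running on.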

A really interesting thing that hit us was that the code had assumptions about the order of things. Basically, the tests assumed that the order of certain things in the resulting output was predictable even though it was produced from a dict. dicts famously make no guarantees about the order you get things out; that's by design. That it worked until now was not only luck but quite amazing.
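The usual fix is to stop depending on dict order at the output boundary, for example by sorting. This is a generic illustration of the problem, not premailer's code; `serialize_style` and the property names are made up for the example:

```python
# CSS properties collected into a dict have no guaranteed iteration order
# (on the Python versions premailer targeted), so serialize them in an
# explicit, stable order instead of relying on whatever order the dict yields.
def serialize_style(props):
    """Render a property dict as an inline style string, sorted for stability."""
    return "; ".join("%s:%s" % (k, v) for k, v in sorted(props.items()))

style = {"color": "red", "font-size": "12px", "background": "white"}
print(serialize_style(style))
# prints: background:white; color:red; font-size:12px
```

With a sorted boundary like this, tests can assert on exact output strings without being at the mercy of hash ordering.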

Anyway, check it out. Now that we have a tox.ini file it should become much easier to run tests which I hope means patches will be better checked as they come in.

http://www.peterbe.com/plog/premailer-on-python-3


Mozilla Open Policy & Advocacy Blog: Launching the Ford-Mozilla Open Web Fellows Program, a Global Initiative to Recruit the Heroes of the Open Internet

Wednesday, October 8, 2014, 21:21

{Re-posting from the Mozilla Blog on Sep 30, 2014}

By Mark Surman, Executive Director, Mozilla Foundation; and Darren Walker, President, Ford Foundation

We are at a critical point in the evolution of the Internet. Despite its emergence as an integral part of modern life, the Internet remains a contested space. Far too often, we see its core ethos – a medium where anyone can make anything and share it with anyone – undermined by forces that wish to make it less free and open. In a world in which the future health of the Internet is vital to democratic discourse and a free flow of ideas, we need a band of dedicated individuals standing ready to protect it.

That’s why we are joining together today to launch the Ford-Mozilla Open Web Fellows program, a landmark initiative to create a worldwide community of leaders who will advance and protect the free and open Web.

Working in the Open on Core Issues with the World’s Most Innovative Organizations

Ford-Mozilla Fellows will be immersed in projects that create a better understanding of Internet policy issues among civil society, policy makers, and the broader public. Fellows will be technologists, hackers, and makers who work on a range of Internet policy issues, from privacy and security to surveillance and net neutrality. They will create an affirmative agenda and improve coordination across the sector, boosting the overall number of people throughout society (in nonprofit, government, philanthropy, academic and corporate sectors) that protect the Internet. At present, a whole new architecture is emerging at NGOs and in government where a technology lens is vital to achieving results, just as a focus on law and communications were important in building previous capacity. Fellows will be encouraged to work in the open so that they can share their experiences and learnings with others. Around the world, civil society organizations are working under difficult situations to advance social justice and create a thriving digital society where all voices have an opportunity to be heard.

Fellows will serve as technology advisors, mentors and ambassadors to host organizations, helping to better inform the policy discussion. We are thrilled to name the first cohort organizations that will host a Fellow in the first year of the program. They include:

A Call for Fellowship Applicants

Today also marks the official opening of the application window. Beginning immediately, people can apply to be a Ford-Mozilla Open Web Fellow by visiting www.mozilla.org/advocacy. The application deadline is December 31, 2014.

We are looking for emerging leaders who have a passion for influencing and informing the public policies that impact the Internet. Selected Fellows will have a track record of making and contributing to projects and an interest in working with Mozilla, the Ford Foundation, and our host organizations on specific initiatives to advance and protect the free and open Web.

Protecting the Internet

The Internet has the potential to be the greatest global resource in the history of the world, accessible to and shaped by all people. It has the promise to be the first medium in which anyone can make anything, and share it with anyone. In many ways, it already has helped bend the arc of history towards enlightenment and justice.

But continuing in that direction isn’t guaranteed without help. For all the good that can come from the Internet, in some areas it is already being used to weaken society and concentrate power in the hands of the few, and to shut down democratic discourse. The fight over preserving net neutrality in the U.S.; the debate over governments undermining the Internet to further surveillance efforts; the curtailing of speech and access to the Internet by authoritarian regimes — these are all threats to the Internet and to civil rights.

We need to take up the challenge to prevent this from happening. We must support the heroes – the developers, advocates and people who are fighting to protect and advance the free and open Internet. We must train the next generation of leaders in the promise and pitfalls of technology. We need to build alliances and infrastructure to bridge technology policy and social policy.

The Ford-Mozilla Open Web Fellows program is an effort to find and support the emerging leaders in the fight to protect the free and open Internet. Apply to become a Ford-Mozilla Fellow and tell us how you would stand up to protect and advance the Web to continue the effort to bend the arc toward justice.

https://blog.mozilla.org/netpolicy/2014/10/08/launching-the-ford-mozilla-open-web-fellows-program-a-global-initiative-to-recruit-the-heroes-of-the-open-internet/


Jeff Walden: Holt v. Hobbs: Is a prisoner’s 1/2'' beard so dangerous that he can’t have it even if his religion requires it?

Wednesday, October 8, 2014, 20:17

Now the second, final argument this trip. (There are other arguments this week, some interesting enough to attend. But I ran out of time to prepare for them or attend them.) Holt v. Hobbs is much simpler than Heien v. North Carolina, because one side’s arguments are “almost preposterous”. So this post is (slightly) breezier.

This line was a bit different from the Heien line: more people attending for (this) argument, fewer people present simply for opening day. The line was possibly less talkative (and I still had briefs to read, although I never intended to read all twenty-one [!] of them), but there were still good discussions with local law students, the author of one of the amicus briefs (which I naturally read standing in line), and others. Good fun again.

The line at 05:49 for Holt v. Hobbs
Another day, another line

Gregory Holt and his would-be beard

Gregory Holt is a Muslim inmate in the Arkansas prison system. (He actually goes by Abdul Maalik Muhammad now; Gregory Holt is his birth [legal?] name. News stories and legal discussion refer to him as Holt, and in some sense I want this in that corpus, so I use Holt here.) Holt interprets Islamic law to require he have a beard.

Allah’s Messenger said, “Cut the moustaches short and leave the beard (as it is).”

The Translation of the Meanings of Sahih Al-Bukhari ¶ 5893 (Muhammad Muhsin Khan trans., Darussalam Pubs. 1997)

A small request. Reasonable? Quoting the ever-colorful Justice Scalia in oral argument, “Religious beliefs aren’t reasonable. I mean, religious beliefs are categorical. You know, it’s God tells you. It’s not a matter of being reasonable.” Reasonable or not, a beard isn’t an obviously dangerous request like, “My religion requires I carry a broadsword.” And as a conciliatory gesture Holt moderated his request to a half-inch beard.

Sunrise over the Court, with a camera crew and reporter in the foreground
No matter how many arguments I go to (this makes ten), the sunrise over the Court will never get old

Arkansas: no beards

Arkansas doesn’t permit prisoners to grow beards (except to the natural extent between twice-weekly shaves). There’s an exception for prisoners with medical conditions (typically burn victims), shaving only to 1/4''. But no religious exceptions.

Arkansas’s justifications are three. A beard could hide contraband. A bearded prisoner can shave to disguise himself, hindering rapid identification and perhaps aiding an escape (see The Fugitive). And it’s a hassle measuring half-inch beards on everyone.

The law’s requirements

Twenty-odd years ago, Holt would likely have been out of luck. Turner v. Safley permitted regulations “reasonably related to legitimate penological objectives”. And Justice Scalia’s Employment Division v. Smith says that as a constitutional matter, generally-applicable laws may burden religious exercise, with objectors having no recourse. It’d be an uphill slog getting past the no-beard rule.

But in the mid-1990s to 2000, Congress near-unanimously statutorily protected some exercises of religion, even against generally-applicable laws. (Lest it be thought this was protection specifically, or only, of Christian beliefs: the original motivating case was a Native American group that used a hallucinogen for sacramental purposes.) In particular Congress enacted the Religious Land Use and Institutionalized Persons Act (RLUIPA, usually “ruh-loo-pah”), stating:

No government shall impose a substantial burden on the religious exercise of [a prisoner], even if the burden results from a rule of general applicability, unless the government demonstrates that imposition of the burden on that person—

  1. is in furtherance of a compelling governmental interest; and
  2. is the least restrictive means of furthering that compelling governmental interest

And “religious exercise” is later defined as:

The term “religious exercise” includes any exercise of religion, whether or not compelled by, or central to, a system of religious belief.

Now, prisons may regulate in pursuit of normal prison aims. But regulations can’t “substantial[ly] burden” a prisoner’s “religious exercise”, regardless how important the exercise is(n’t) in the prisoner’s belief system, even if the regulation is general and doesn’t target religion — unless the government demonstrates the regulation satisfies a “compelling interest” that can’t be addressed less restrictively. This phrasing comes from strict scrutiny: the strongest form of review American courts apply to laws. Unlike the Turner/Smith regime, these requirements have teeth.

The oral argument line, extending down the block at 07:19
Almost go-time to advance onto the plaza to receive line numbers

Evaluating Arkansas’s no-beard rule applied to Holt

As a threshold matter, Holt must wish to engage in “religious exercise” that is “substantial[ly] burden[ed]”. Once Holt claims the belief, courts won’t second-guess it. They will consider whether the belief is sincere: no opportunistic exception requests for unwarranted benefits. But no one contests the sincerity of Holt’s beliefs. If Holt refuses to be shaved, he’ll suffer various disciplinary actions and bad consequences: “loss of privileges, punitive segregation, punitive work assignments, and loss of good-time credits”. Certainly a substantial burden.

Now Arkansas must demonstrate — with evidence, persuasively — both a compelling interest, and least restrictive means. Put another way, does Arkansas’s regulation pass strict scrutiny?

Arkansas’s claimed interests are “prison safety and security”. But a no-beards rule only marginally advances these goals, and “the government does not have a compelling interest in each marginal percentage point by which its goals are advanced.” Arkansas’s interest must be more specific: an interest specifically in no beards.

It’s hard to say Arkansas has a compelling interest when the rules in forty-odd prison systems nationwide, and various penal code recommendations, either impose no restrictions on beards among prisoners, or would allow Holt his 1/2'' beard. Arkansas is an outlier. And Arkansas’s medical exemption undermines the argument that no beards must apply universally (compelling interests often brook no exceptions). Similarly, Arkansas can’t use the least restrictive means when forty jurisdictions use even less restrictive means.

Arkansas might justify their policy through unique local experience. But Arkansas concedes “no example” of anyone hiding contraband in a beard. (With the “caveat” that “Just because we haven’t found the example doesn’t mean they aren’t there.” A strong argument!) Disguise arguments could be addressed by taking multiple pictures (as other systems do). And measuring the few inmates requesting religious exemptions wouldn’t be much harder than measuring medical-exception beards.

Arkansas could “demonstrate” strict scrutiny is satisfied by providing evidence of evaluation and reasoned rejection of other states’ policies. But Arkansas previously admitted it considered no other systems (eliciting an acerbic suggestion to try “the common practice of picking up the phone to call other prisons”).

Arkansas could argue that Arkansas’s system, that houses many prisoners in barracks and not separate cells, justifies no beards. But such systems exist elsewhere, and no beards applies in Arkansas’s non-barracks prisons.

In short, Arkansas has demonstrated neither a compelling interest, nor least restrictive means, and it has done so presenting no evidence. Ouch.

In lower courts

An obvious question: why must Holt fight this in court if he’s so obviously right? Basically, a few lower courts are giving far too much deference (a word found in legislative history but not in the statute) to the mere assertions of prison officials, without requiring them to “demonstrate” much of anything. The magistrate judge described officials’ claim that Holt might hide something in his half-inch beard as “almost preposterous” — just before deferring to those claims. Courts below the Supreme Court similarly gave too much deference to prison officials’ bare assertions unsupported by any data.

At the Supreme Court

One indicator of lopsidedness here is the brief count, and authors, on each side. Holt has seventeen other briefs on his side, representing a wide variety of interests: Jewish, Christian, Islamic, Hindu, Sikh, American Indian and Hawaiian, former prison wardens, former corrections officials, Americans United for Separation of Church and State (whose brief, incidentally, is interesting but quite surpassed by later events), sociologists, and the United States government (and others). The authors include a who’s-who of religious freedom organizations. Arkansas has one brief on its side: from eighteen states, who don’t defend Arkansas’s policy as much as try to preserve deference as an element to consider (presumably so those states’ prison systems can be run with a freer hand).

The Court accepted this case in unusual circumstances. Holt filed a hand-written petition requesting Supreme Court review, through a special system not requiring him to pay filing fees from non-existent income. Such petitions are almost never accepted. (Holt basically won the lottery. That said, when I read his brief after the case was accepted, the form was unusual, but the discussion and presentation seemed orthodox.) It’s pretty clear the Court accepted this case to lopsidedly, probably unanimously, overturn the Eighth Circuit. The Supreme Court doesn’t take cases to correct errors, but that’s what they’ll do here.

The #12 admission card
Number 12 today: slipping back slightly, but as far as I’m concerned this means I had perfect timing

Oral argument

The argument questions ran largely in three veins: pondering deference, drawing a line, and almost mocking Arkansas’s arguments. Holt’s counsel faced difficult questions, but not skeptical questions.

Deference

First, what does deference (if it even matters — the term appears only in legislative history, not in the law as enacted) look like in the context of strict scrutiny? These are somewhat contradictions in terms. Yet the Court somehow must make sense of this.

Line-drawing

Second, while beards are easy to decide, other issues (Sikh turbans that actually can conceal things, for example) will require different considerations. How can the Court provide general guidelines to address these situations? The Court doesn’t want to be in the business of reviewing every prison official’s (better-“demonstrated”) decisions. (Scalia bluntly put it this way: “Bear in mind I would not have enacted this statute, but there it is.” Recall he wrote Employment Division v. Smith, shutting off constitutional religious exemptions from generally-applicable laws. Something to remember any time Scalia’s stereotyped as reflexively pro-religion.) But Congress opened up that box, so courts have to live with it.

Almost mocking questions

Arkansas’s position is not easily defended. Not surprisingly, then, questions and comments almost made fun of Arkansas’s position. To the assertion that “Just because we haven’t found the example doesn’t mean they aren’t there”, Justice Breyer replied, “There are a lot of things we’ve never found that might be there and I’ll refrain from mentioning them. You see them on television, a lot of weird programs from time to time.” (Presumably referring to things like Sasquatch, the Loch Ness Monster, Ghost Hunters, and similar.) And later, Justice Alito proposed an alternative means of detecting beard contraband: “Why can’t the prison just…say comb your beard, and if there’s anything in there, if there’s a SIM card in there, or a revolver, or anything else you think ­­can be hidden in a half-inch beard…” (emphases added). Both lines made the audience erupt in laughter.

Post-Holt crowds on the Supreme Court plaza
The post-argument crowds, framed by visitor lines

Why Arkansas fights

It’s unclear to me why Arkansas is still arguing. They won in lower courts. But once the Court agreed to review the in forma pauperis petition, Arkansas should have folded. The law is too clearly against them, and this Court won’t give them a pass. Arkansans should be outraged that their state is wasting taxpayer money to defend this system. (And on the policy’s merits, outraged at the petty bureaucratic nonsense at best, and bigotry at worst, it represents.)

One plausible, potentially upsetting, explanation is provided by former prison wardens: “Political Considerations May Underlie Prison Officials’ Resistance to Accommodations of Religious Practices.” These wardens had been sued (and lost) in various cases cited in briefing, and they candidly admitted that their positions were partly attributable to “political realities”.

Conclusion

Arkansas will lose. The only remaining question is how. (And as before, if I’ve made any mistakes in this discussion, please point them out.)

http://whereswalden.com/2014/10/08/holt-v-hobbs-is-a-prisoners-12-beard-so-dangerous-that-he-cant-have-it-even-if-his-religion-requires-it/


Doug Belshaw: Playtesting for MozFest

Wednesday, October 8, 2014, 20:02

Today I was down at Bishop Grosseteste University, giving a guest lecture and facilitating a workshop. The module was on digital literacies as part of a course for Early Years students. These are students who may go on to teacher training. Some of the work relating to my thesis and the work I’ve done with Mozilla is on their reading list.

Web

From my point of view it was a useful opportunity to playtest some sessions I’ve got planned for the Mozilla Festival at the end of the month. I’ve travelled a lot less in the year since I moved to the Webmaker team, and so I welcomed this opportunity to refine some of my thinking. It’s also good to get input from as many places as possible about Web Literacy Map v2.0.

I made the lecture as participatory as the logic of the lecture theatre allowed. You can find my slides here. We had a backchannel in the form of a Google Doc which surfaced some interesting questions. On a meta level my aim was to highlight the importance of attention. I’m coming round to Howard Rheingold’s view that it’s key to everything we do, both online and offline. Interestingly, one of the questions was whether a single Web Literacy Map can be relevant to everyone.

For the workshop, I split the students into two groups and swapped over halfway. After an introduction to both workshops, half stayed with the course leaders, and the other half came with me. Those who came with me read a chapter of my book The Essential Elements of Digital Literacies followed by a discussion. Those who stayed behind completed a Web Literacy Map activity on their iPads.

Three things stand out in particular from the discussion I had with students:

  1. Confidence. One student had an insight about why she’s always shied away from using technology. She explained that she wasn’t exposed to it at a young age in the same way others had been. As a result, she’s always felt under-confident around anything digital and never wants to do more than she has to with it.

  2. Filtering. As I point out in my book, I’m against filtered internet connections. That position, however, presupposes the ability to have a rational conversation with users instead of simply filtering. In Early Years (ages 3-5) this isn’t necessarily the case.

  3. Unintended consequences. We know that people devise workarounds to problems they have. Students talked about the ways in which school filters had prevented them accessing Facebook. As a result, they resorted to ‘dodgy’ websites that had evaded filters. These often featured inappropriate advertising and malware, but promised access to Facebook. User accounts were often hacked. By filtering, the school had driven students towards those things they were trying to prevent them doing or seeing.

I’m still waiting to see all the results of the Web Literacy Map activity I set, but the couple of examples I saw were promising. Students added, renamed and re-arranged the competencies of the Web Literacy Map v1.1, which led to some curious groupings. I wouldn’t necessarily have thought of putting ‘Credibility’ together with ‘Security’ and ‘Privacy’, for example. It was also interesting that it wasn’t immediately obvious to them what ‘Infrastructure’ means.

For MozFest, I’m going to:

  • Refine the Web Literacy Map activity based on the results of the survey we’re launching this week.
  • Think about where skills and competencies related to ‘e-safety’ should sit.
  • Revisit Beetham & Sharpe’s (2009) taxonomy of access, skills, practices, and attributes.

All in all, it was definitely a worthwhile trip down to Lincoln for me. I hope it was for the students and course leaders, too! Many thanks to Ben Samuels for the invitation, and to Chris Bonfield, Mary-Louise Maynes and team for their warm welcome!

http://literaci.es/playtesting-for-mozfest


Jen Fong-Adwent: End of a year, start of a new year

Wednesday, October 8, 2014, 19:00
A year ago I publicly announced Meatspace as an experiment in realtime chat with animated GIFs.

http://ednapiranha.com/2014/end-of-a-year-start-of-a-new-year/


Gervase Markham: Rebel Alliance Ideas

Wednesday, October 8, 2014, 15:34

Chris Beard has been encouraging us to think like the rebels: what can we do that other people won’t do? How can we make an impact?

Here are some of my thoughts:

  • The internet, on global average, is getting less reliable, slower and laggier. Finish Janus and persuade our mobile partners to deploy it and default to it. Your Firefox OS phone now accesses the net faster than an Android phone.
  • Make Firefox OS connect by default to openwireless.org access points, and encourage Firefox OS users to run them. There’s a virtuous circle here. More net in more places; a global movement of being generous with net access.
  • Finish Daala faster, by finding people other than the core team to do everything except design the codec and write the algorithms (e.g., testing, speed optimizations, fuzzing, writing Windows Media plugins). We need to get the word out that this project is critical.
  • Show the core free software community, who have great influence over tech choices and who should be our natural allies, that we care about them. Be the first organization ever to make a free-from-bottom-to-top mobile phone (running Firefox OS) and give some help to the Replicant team to port to it as well, just to prove we mean it and it’s real.
  • Make it possible to search for specifically open source software in the Marketplace, and show we believe it “promotes the development of the Internet as a public resource” by promoting apps which are open source.
  • Ship Collusion (which has been in the works for years), even if there’s not a perfect mapping between what it shows you and what’s actually bad. Make sites feel they have to justify all their third-party links.

What are your ideas?

http://feedproxy.google.com/~r/HackingForChrist/~3/cXcxU_P5oVE/

