AO3 News

Seven days of love on the AO3!

Published: 2012-02-07 17:31:27 -0500

It's February, and Valentine's Day approaches. We figure this is a great opportunity to spread a little love on the AO3!

Over the next seven days, we'll be posting a new prompt each day to encourage AO3 users to spread the love. Celebrate the wonderful work of fandom and share the joy!

We kick off with a simple way to shower people with love: your task for today is to leave kudos on seven works. New works you've just discovered, old ones you unaccountably missed - let their creators know you loved their work! Maybe you read the work when it was posted on another site and you haven't given the AO3 incarnation any love yet, or maybe you downloaded it and never got around to coming back to leave some love. Maybe it's a hidden treasure on the AO3 and hasn't received much attention yet. Seek out those works, old and new, and shower them with hearts!

AO3 kudos image: stylised AO3 made to look like a figure with arms raised in joy, with hearts floating overhead


Update - AO3 Performance (now with balancing unicorns!)

Published: 2012-02-07 07:52:16 -0500

As we hope users have noticed, the recent performance issues on the AO3 have been alleviated! \o/ The fix came courtesy of our lovely Systems team, who tweaked our server settings to add more unicorns! Unicorns are the worker processes which actually serve requests - if there are too many, things slow down because the servers are overloaded, but if there are too few, things slow down because we're refusing requests we could potentially serve. (It's a whole unicorn balancing act!)
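To give a flavour of the balancing act (this is an illustrative heuristic only, not our actual server settings), pools of worker processes like this are often sized from CPU cores and available memory, taking whichever limit is smaller:

```python
def suggested_workers(cpu_cores, ram_mb, worker_ram_mb, headroom_mb=1024):
    """Toy heuristic for sizing a pool of preforking web workers.

    Too many workers and they contend for CPU and memory; too few and
    requests queue up (or are refused) while capacity sits idle.
    """
    cpu_limit = cpu_cores * 2 + 1  # common rule of thumb for mixed I/O and CPU work
    ram_limit = (ram_mb - headroom_mb) // worker_ram_mb  # leave headroom for the OS
    return max(1, min(cpu_limit, ram_limit))

print(suggested_workers(cpu_cores=8, ram_mb=16384, worker_ram_mb=512))  # → 17
```

In practice the right number is found by measuring, as Sidra did - the heuristic just gives a starting point for the tuning.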

We weren't optimistic about how much difference changing our unicorn settings would make, because we know we have some underlying application issues which can bog down the database. So, we're really pleased that Sidra was able to find a new balance for the unicorns which has produced noticeable improvements in performance.

Meanwhile, behind the scenes, many of our other fine staff have been working on the performance issues. Sysadmin James_ has been working on partitioning the database: data is stored in tables, and in the new version of MySQL we're using, there's a way to split these tables into parts without affecting anything else. This means that a query can search within a much smaller table rather than the whole big block of data, which makes everything faster. With the help of Systems staff, our coders have been going through the slow query log, looking for places where the code runs slowly and needs rewriting to be more efficient. James_ has also been looking into caching options for us - we'd like to use Squid or something similar (the unicorns need some tentacly friends), but we have a lot of different options which affect exactly what is displayed on the page, so we need to figure out the best ways of dealing with those.
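The partitioning idea can be sketched in miniature (a toy model only - MySQL does this internally, and the real partition boundaries are an assumption here): rows are split into buckets by id range, so a lookup only scans one small bucket instead of the whole table.

```python
import bisect

# Hypothetical upper bounds of each partition, by row id
PARTITION_BOUNDS = [100_000, 200_000, 300_000]

def partition_for(row_id):
    """Pick which partition a row id falls into ('partition pruning')."""
    return bisect.bisect_left(PARTITION_BOUNDS, row_id + 1)

partitions = [[] for _ in range(len(PARTITION_BOUNDS) + 1)]

def insert(row_id, data):
    partitions[partition_for(row_id)].append((row_id, data))

def lookup(row_id):
    # Scan only the one partition that can contain the row.
    for rid, data in partitions[partition_for(row_id)]:
        if rid == row_id:
            return data
    return None

insert(42, "work A")
insert(150_000, "work B")
print(lookup(150_000))  # → work B
```

The win comes from the lookup touching one partition's worth of rows rather than the full table.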

Senior coder Elz is also continuing her work on rewriting our browsing filters, which are one of the biggest problem areas in terms of performance - she hopes to have the first version of this ready for testing soon.

We're really happy that the unicorns have helped us address the immediate issue, but we know this doesn't get us off the hook - we still have lots of work to do! While much of this work goes on behind the scenes, we wanted our users to have a glimpse of what's happening when we say 'we're working on it'. :)

We'd like to say a big thank you to all the coders and sysadmins who've been working hard on dealing with these issues. We'd also like to say thank you to all the users who have sent us messages of support and encouragement - we really appreciate it and it's a big boost when we're struggling with some tricky issues. Thanks for your patience while we continue to improve the site. <3


Release notes - release

Published: 2012-01-24 04:32:15 -0500

Welcome to our latest release! Elz, Enigel, Jenny S-T, Naomi, Firewolf, Rebecca, and sarken contributed code to this release, which was tested by our awesome testing team: Jenny S-T, Jenn Calaelen, Kylie, Lucy, Tai, XParrot, and Zebra.

This release fixes a few bugs, but also rolls out some exciting new features! We've been working on this stuff for a good while, so we're excited to be able to get it out in the world! Now that we have deployed this code, the next release will be dedicated to performance issues: we'll be trying to keep performance fixes separate from anything else so we can better monitor what's working and what's not.


Fandom feeds!

We know that lots of you want a more convenient way of keeping track of new works posted in your fandoms, so we're pleased to announce the launch of feeds for fandom tags! If you go to a fandom tag - for example, Naruto - you'll see an option to 'Subscribe to the feed'. You can add the feed to any feed reader, or syndicate it on Dreamwidth, Livejournal, etc, and you'll be notified of new works in that fandom. \o/

The feeds are only visible on 'canonical' tags - the standardised tags which are marked for use in filters and which display on the fandom pages. This ensures you get everything tagged for that fandom, because all the variant versions of a fandom tag are connected to the canonical - for example, Supernatural also gets works marked SPN. However, if something changes and the tag you have subscribed to is no longer marked as canonical, your feed will still work.

If you subscribe to a 'metatag' which includes a lot of fandoms, you'll get updates for all those fandoms - for example, subscribing to Marvel will give you a feed which includes Captain America (2011), X-Men First Class (2011), Marvel 616, Iron Man (Movies), and all the other fandom tags that wranglers have linked to the metatag.

Right now, feeds don't show restricted works (we may introduce authenticated feeds sometime down the road). They also list works by creation date, not update date - this means that updated WIPs won't get bumped back to the top of the feed. We'll be working on improvements going forward - we welcome feedback.

We hope this makes it easier for you to keep track of ALL the works you're interested in!
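For the curious, feeds like these are standard Atom documents, which is why any feed reader can consume them. A minimal sketch of what a reader does with one (the sample document and its fields are invented for illustration - the real feed's exact shape may differ):

```python
import xml.etree.ElementTree as ET

# An invented, minimal Atom document of the general sort a fandom feed returns
ATOM = """<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>New works in Example Fandom</title>
  <entry>
    <title>A New Work</title>
    <link href="https://example.org/works/1"/>
    <updated>2012-01-24T04:32:15-05:00</updated>
  </entry>
</feed>"""

NS = {"atom": "http://www.w3.org/2005/Atom"}

def entry_titles(atom_xml):
    """Pull the title of each work entry out of an Atom feed."""
    root = ET.fromstring(atom_xml)
    return [entry.findtext("atom:title", namespaces=NS)
            for entry in root.findall("atom:entry", NS)]

print(entry_titles(ATOM))  # → ['A New Work']
```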

Subscriptions to individual works

You've been asking for this for a long time, so we're really excited to be able to offer subscriptions to individual works! You can now choose to subscribe to a work-in-progress and get notified by email when a new chapter is added. \o/ HURRAY!

Importing features for Open Doors

One of the key aims of the Archive of Our Own is the preservation of fannish history. As part of this, we want mods to be able to import archives they maintain elsewhere, so they can rescue at-risk archives. We've now finished the code which will allow this! \o/ Working with the Open Doors committee, we'll be importing the Smallville Slash Archive as our first test case in the next month or so. For more on Open Doors and the archive importing feature, check out the Open Doors FAQ. Stay tuned for more news!

Improvements to our HTML parser

Those of you who have been with us for a while will know that improving our HTML parser, which cleans up your HTML when you post a work, has been an ongoing project. In this release, coder Rebecca has included a bunch of improvements, which include fixes for some of the most annoying bugs - no more br tags insinuating themselves into your code where they aren't wanted! HTML parsing is a very, very tricky area, and although we've tested as thoroughly as possible we can't test every possible use case. So, if you find that it does something undesirable to your code, please get in touch with Support.

Known Issues

If you've been using the site at all recently, you've probably noticed that we've been hit by a lot of performance issues. We've written a bit about what's going on and what we're doing to address it in our recent post on performance. The good news is that the new subscriptions options should actually help a bit - a lot of the weight on the site is people hitting index pages to see if anything has been posted in their fandom and/or if the WIP they're following has been updated. Going directly to a work is much less work for our servers, so giving people new ways to track what they want to read is likely to ease the performance issues rather than exacerbating them. The performance issues are our highest priority right now, so we'll be working on them for the next wee while.

We've had a few reports of email notifications not getting through. It looks like some ISPs may be marking some of our emails as spam: we're looking into the issue, but adding us to your email contacts list will help ensure our messages are treated as legitimate.

See our Known Issues page for other current issues.

Release Details


  • Added an option to subscribe to individual works - you can now be notified when someone uploads a new chapter to a WIP! \o/
  • Added feeds for all fandom tags! \o/
  • Removed 'Series:' information from works which aren't part of a series when displaying feeds.
  • Added chapter count to 'Share' option and subscriptions information.
  • Added option to add bookmarks to collections.
  • Added new code for importing archives: people with the archivist role can now import works for other users, and the owners of the works will be notified by email
  • Added a comments field to icons, so people can give credit and other information

Bug fixes

  • Fixed importing from Livejournal (formatting was broken by the recent changes on their end)
  • Ensured epub mimetype headers are no longer compressed - should fix a bug causing Internet Explorer to download epubs as zip format, and should allow mobile browsers to recognise the content correctly.
  • Made database index for user accounts unique to prevent duplicate accounts
  • Fixed some outstanding bugs with our banner notices, so that they can now be dismissed by users
  • Parser and sanitizer
    • Fixed the issue where the parser would add an unwanted <br /> tag after various HTML tags
    • Fixed the problem where if there were some paragraph tags in the text, the parser wouldn't correctly add more when you edited.
    • Improved the parser's ability to handle mixed uppercase & lowercase HTML
    • Parser now respects blank spacing inside pre tags
    • Made it possible to assign more than one class to an HTML tag on a work
  • Tags and tag wrangling
    • Fixed bug where wranglers were unable to create tags with accented characters if a tag with the same name in unaccented characters already existed
    • Fixed a bug where tag hierarchical trees on wrangler pages did not display properly
  • Front end fixes
    • Made link styles in #footer more consistently defined
    • Fixed issue causing buttons to display weirdly on chapter reorder pages
    • Fixed bug causing filters to overlap inbox contents on mobile skin
    • Fixed display issues on the static version of fandom pages
    • Added some spacing on the static version of works pages so that kudos buttons aren't too close to text
    • Changed styling of :focus to make it more consistent and visible
    • Fixed header icon display so that it doesn't overlap post new/login buttons in Chrome and Safari
    • Input and h3 on reorder series page both set to full width


OTW action on SOPA/PIPA

Published: 2012-01-17 18:22:09 -0500

The internet has been abuzz recently with comments about the 'Stop Online Piracy Act' (SOPA) currently under debate in the US House of Representatives, and its counterpart the 'Protect IP Act' (PIPA) in the Senate. Organizations such as the EFF and the Library Copyright Alliance have raised concerns that the bills - which are ostensibly aimed at curbing 'rogue' foreign sites - have significant implications for the web internationally, and will work to curb free speech and online creativity.

Here at the OTW, we've been following developments since the bill was first mooted. SOPA has particular implications for sites which include user-generated content because of the broad language in the bill. This means that it has the potential to negatively affect many popular fansites - including the Archive of Our Own and Fanlore - if it is implemented in its current form.

Following protests from many groups, the Obama administration issued a statement which was seen by the New York Times as a significant blow to the proposed legislation. Nevertheless, the EFF argues that it still poses a significant threat.

In order to make sure that members of the US Senate and House of Representatives understand the problematic nature of the proposed legislation, many sites around the internet are taking part in an 'internet strike'. The OTW will be joining this day of action with a banner on the Archive of Our Own and a blackout on our main website. If you are a US citizen, we urge you to contact your representatives and senators to let them know how you feel about these bills.


Performance issues on the AO3

Published: 2012-01-17 17:05:30 -0500

As many users will no doubt have noticed, the AO3 has been experiencing some performance issues since the start of the year. When we posted on 5th January, we were expecting those problems to ease once the holiday rush was over. However, that hasn't turned out to be the case. We're working on ways of dealing with the performance issues, but we wanted to keep you updated with what's going on while we do that.

Why the slowdowns?

In the past month, over 2000 new users have created accounts on the Archive. At the same time, the number of people reading on the Archive - with or without accounts - has been steadily growing. This has been part of a general trend, as you can see if you look at the graph showing number of visits to the Archive since November:

Line graph showing number of visits to the AO3, November to January. The line gradually goes up (with spikes on Sundays) before peaking dramatically on Jan 2.

We're always much busier on Sundays, but the number of visits has been gradually going up each week since November (and the same holds true for the preceding months). However, before December we were hovering around the 135,000 level for visitor numbers at peak times. You can see that the visitor numbers began to climb more dramatically in December, peaking on 2nd January when we had 182,958 visitors. Crucially, after that spike it didn't drop back down to anything like the levels it had been at previously: we're now at more than 150,000 visits on a regular day, and more than 165,000 on Sundays, our busiest day. Wow!

We were expecting a big spike over the holidays, when there are lots of challenges and lots of people with a little spare time for reading and creating. However, we hadn't expected site usage to remain quite so high after the holidays were over! The increases mean that the site is now under a holiday load every day, which is one reason things have been running a little slowly.

The other reason for the slowdowns is that the increase in our number of registered users, and the holiday challenge season, has produced a big increase in the number of works. In fact, 11,516 new works have already been posted since the end of December! More data in our databases means more work for things like sorting, searching, etc - this means that sometimes the database just doesn't serve up the result you need in time, and the unicorn which is waiting for that result gives up and goes away (yes, really - our servers are assisted by unicorns :D).

We've been expecting this general effect for a while now, and we've been working towards implementing things to deal with it; however, we weren't expecting quite such a big jump in site usage in the past month!

What are you doing about this?

The Accessibility, Design & Technology and Systems Committees had a special meeting on Saturday to discuss ways of dealing with the immediate problem, as well as longer term plans. It can be tricky to test for high load situations before they actually occur, but once they do occur there's lots of data we can gather to help us address the most crucial issues. (We're also working on implementing more tools which will help us test this stuff before it comes up.)

Short term

More caching: We already cache pages (or sections of pages) across the site - this means we store a copy which we can serve up directly, instead of creating the page every time someone wants to use it. If something changes, then the cache is expired and a new, updated copy is created. Hitherto, we've focused on caching chunks of information which are unlikely to change rapidly: for example, on any works index the 'blurbs' which show the information about each work are cached. However, some of the heaviest load is caused by rapidly changing pages like the works index. We're moving towards more caching of whole pages, so that a new copy of the works index (for example) will be created every five minutes rather than generated each time someone asks for it. This means things like works indexes will be a little slower to update - when you add a new work, it won't appear on the list until the cache expires - but that five minute delay will massively reduce the weight on our servers.
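The whole-page caching described above boils down to a time-based cache. A minimal sketch (illustrative only - the Archive's real caching layer differs in detail):

```python
import time

class TTLCache:
    """Minimal sketch of whole-page caching with a fixed expiry.

    A cached copy is served until it is older than `ttl` seconds; only
    then is the page regenerated, so an expensive index page is built at
    most once per interval no matter how many requests hit it.
    """
    def __init__(self, ttl=300):  # 300 s = the five-minute window described above
        self.ttl = ttl
        self.store = {}  # key -> (rendered_page, cached_at)

    def fetch(self, key, generate):
        entry = self.store.get(key)
        now = time.monotonic()
        if entry and now - entry[1] < self.ttl:
            return entry[0]        # cache hit: serve the stored copy
        page = generate()          # cache miss or expired: rebuild once
        self.store[key] = (page, now)
        return page

cache = TTLCache(ttl=300)
calls = []

def render_index():
    calls.append(1)  # stands in for the expensive database work
    return "<works index>"

cache.fetch("/works", render_index)
cache.fetch("/works", render_index)
print(len(calls))  # → 1 (the second request was served from the cache)
```

This is exactly the trade-off described above: the page can be up to five minutes stale, but the expensive work happens once per window instead of once per visitor.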

More indexes: We have a few places in our databases - for example the tables for the skins - which could use more indexes. Indexes speed things up because the server can just search through those rather than the whole table. So, we're hunting out places where more indexes are needed, and implementing them. :)
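Why an index helps can be shown with a toy model (illustrative only - a real database index is a B-tree, not a Python dict, and the table here is invented): instead of scanning every row, keep a lookup table from a column's value to the matching rows.

```python
# A tiny stand-in for a database table
rows = [
    {"id": 1, "skin": "dark"},
    {"id": 2, "skin": "light"},
    {"id": 3, "skin": "dark"},
]

def full_table_scan(skin):
    # Without an index: check every row (work grows with table size)
    return [r["id"] for r in rows if r["skin"] == skin]

# Build the index once, up front
index = {}
for r in rows:
    index.setdefault(r["skin"], []).append(r["id"])

def indexed_lookup(skin):
    # With an index: jump straight to the matching rows
    return index.get(skin, [])

print(full_table_scan("dark"), indexed_lookup("dark"))  # → [1, 3] [1, 3]
```

Both return the same answer; the index just gets there without touching the rest of the table, which is the whole point on tables with millions of rows.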

Medium term

Bad queries must die: We have a few queries which are very long and complicated, and take a long time to run. We need to rewrite these bits of the code to make them simpler and faster! In many cases this will be quite complicated (or else we would have done it already), but it's a priority to help us speed things up.

New filters for great justice: The filters that are implemented on our index pages are not really optimal considering the size of the site now - the limitations of that code are the reason we have to have a 1000 work cap on the number of works returned. We have been working on this for a long time - we need to completely throw out what we have and implement a system which works better for the site as it is now. Again, this is really complicated, which is why it's taken us a long time to achieve it even though we knew it was important - the good news is that we have now done quite a lot of work on this area and the first round of changes should be out in the next few months.

Long term

Long term, we're going to be moving to a setup which allows us to distribute our site load across more servers. This will involve database sharding - putting different bits of the database on separate servers - so it will take quite a lot of planning and expertise. If you're a user of Livejournal or Dreamwidth, you might be aware that your journal is hosted on a certain 'cluster' - we'd be moving to a similar system. We want to make sure we do this right, but based on the way the site is growing we think this is now high priority, and our Systems team are working to figure out the right ways forward.
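In its simplest form, that kind of routing can be sketched like this (a hypothetical example - the shard names and routing rule are made up, and production systems usually keep a directory mapping so accounts can be moved between shards):

```python
# Hypothetical database shards, each living on its own server
SHARDS = ["db1", "db2", "db3"]

def shard_for(user_id):
    """Deterministically map a user to a shard (their 'cluster')."""
    return SHARDS[user_id % len(SHARDS)]

# Every request for a given user goes to the same shard,
# so each server only holds (and queries) a slice of the data.
print(shard_for(42))  # → db1
```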


We know it's really frustrating when the site runs slow or is timing out on you: many apologies. We really appreciate users' patience while we deal with the issues. As you'll see from the above, there are some immediate things we can do to ease the problems, and we also have a good sense of where we need to go from here. So, while these changes need to be implemented as a matter of urgency, we feel confident we will be able to tackle the problems. If you have expertise in the areas of performance, scalability and database management, we would very much welcome additional volunteers.

As we move forward on dealing with problem spots on the site, we may implement some changes which are visible to users: the caching on the index pages and the changes to browsing and searching are two of the most obvious. We'll let you know about this as we go along - we think the effect will be beneficial for everyone, but do be prepared for a few changes! You can keep up with status and deploy news on our Twitter @AO3_Status.

While the growth in the site means we're facing some problems a little sooner than we expected, we're really excited about the fact so many people want to read and post to the AO3. Thanks to everyone for your fannish energy - and apologies for the fact we sometimes slow you down a little.


Rush hour on the AO3!

Published: 2012-01-05 09:59:30 -0500

As many of you will have noticed, we had some site slowdowns and 502 errors over the first couple of days of the year. Apologies for the inconvenience! If you've run into this problem and been wondering what was going on, you might be interested in this:

Line graph of visits to the AO3, 4 December to 3 January. The graph peaks sharply on 1st January

Yes, it looks like lots of fans decided to celebrate the New Year with some delicious fanworks. On Monday 2nd January we had 182,958 visits and 1,066,216 pageviews! Furthermore, an octopus swam off with our servers - volta_arovet's Texts From Cephalopods has had 46,301 hits at the time of writing! So, our servers had plenty of work to do!

Over 2000 new users have joined the Archive in the last couple of weeks, and we have hosted several great challenges, including Yuletide (2598 works!), Due South Secret Santa (a more modest 34 works), and Homestuck Ladyfest Exchange (124 works). So, while we're sorry to have had some slowdowns, overall we are super pleased with how well our shiny new servers have held up - those of you who were with us during the holiday season in previous years will remember that the high traffic of holiday challenges made our old servers very sad.

Looking forward, we're not too worried about performance in the immediate future - there are some code improvements we know we need to make which will improve matters a lot, so those will be high priority. If the AO3 continues to expand at the same rate as this year, we will be looking at more servers sooner rather than later. But in light of the graph above, we're pleased that while we certainly slowed down, we didn't grind to a halt! Thanks to all the coders and sysadmins who did the work to make this possible, and thank you to all the OTW members whose donations helped us buy those hardworking servers (we are always grateful for volunteers or donations)! And, of course, thanks to everyone who reads and posts on the AO3 - we're excited to welcome so many of you!

Once again, apologies to those of you who have been affected by the slowdowns - but hurray for so much beautiful fannish activity!


2011 Year in Review!

Published: 2011-12-31 14:07:42 -0500

2011 was an amazing year for the Archive of Our Own, and we wanted to take a moment to look back and to thank everyone involved, including all of our users and volunteers! AO3 started its open beta about two years ago, towards the end of 2009. That year, we were really still putting the pieces together, building out the core functionality. In 2010, we started to pick up more momentum with people posting their works and archiving their older fic and art. We added gift exchange challenge hosting, kudos, downloads and skins. This year, we've done a lot of work on site performance and infrastructure, usability improvements, and new features like subscriptions and prompt meme challenges. We're looking forward to expanding on that next year and continuing to build a great, stable home for all kinds of fanworks!

Traffic and performance

A drawing of our seven machines!

At the beginning of the year, we moved to a new and bigger set of servers, which gave the site some much-needed room to grow. Our systems team made some tweaks along the way, ensuring that we were getting the best performance out of the new setup. We started using Redis, which is super-fast, for email queues, autocompletes and other background tasks, which took some of the load off our main database.

And even with all the work we were doing, it was tough to keep up with how fast the site was growing! Two-thirds of our current registered users signed up this year, and we kept giving out more and more invitations through our invite queue, but the numbers kept climbing - there were over 2,000 people on the waiting list for several months. (We've finally gotten that down now by sending out even more invitations as the system could handle them.) And many more site visitors aren't registered users - we now get well over half a million unique visitors each month, and there have been 24+ million pageviews in December. We now get as much traffic on an average day as we did last year during Yuletide, which at the time was a huge traffic spike. The period around Christmas, with Yuletide and other holiday exchanges going live, still represents a noticeable jump in traffic, but the difference isn't as great, which means more stable site performance. (\o/ We were standing by with fingers crossed just in case, but we were thrilled that no last-minute work was required this year!)

Fun with charts!

AO3 is currently home to over 8,100 fandoms, 31,000 users and 275,000 works! Here's a graph of work, chapter, bookmark and comment posting over the last three years:

You can see that work posting is up this year, but what's much more dramatic is the increase in reading, bookmarking and commenting. There have also been more multi-chapter works and works-in-progress posted this year, which is exciting. And one of the neat features the archive has is the ability to go back and see what you've read or viewed, for registered users who have it enabled. Here's how that looks year-to-year:

Lots of people viewing lots of stuff! There have also been almost 1.5 million kudos left since last year, so there's been no shortage of love to go around. <3

What's on deck for 2012

In the short term, we have a new release coming out hopefully early this month, and that will include improvements to our HTML parser (yay!), some exciting new subscription options and a variety of bugfixes. There are also a ton of other features and improvements that we've been developing this year that we hope to have ready for you in 2012, including the ability to view the site in other languages, art hosting, an on-site support area and a wealth of browsing, filtering and email improvements for both works and bookmarks. We also hope to start a series of international fandom spotlights in January and solicit more input from users about upcoming features.


And finally: thank you! Thanks to all of the authors, artists, and vidders who have posted their works, to the mods who organize challenges and collections, to those who have shared skins for customizing the site, to everyone who creates bookmarks and leaves comments and kudos, encouraging authors and artists and making it easier for other fans to find awesome works. Thanks to everyone for bearing with our growing pains earlier in the year and for supporting AO3 financially, enabling it to continue operating and improving. And many thanks, as always, to everyone who volunteers their time wrangling tags, writing code, testing the site, handling support requests, and maintaining our systems, and also to everyone who's left comments and written in to our support team with feedback, suggestions, and bug reports, all of which are incredibly valuable! The archive is very much a community effort, and it couldn't exist without all of us working together and supporting one another.



The Accessibility, Design and Technology committee oversees technology-related projects within the OTW. Currently we are responsible for designing and building the Archive of Our Own. Our regular meeting updates keep you informed about developments on the AO3!

This was our final meeting of the year: the OTW takes an end-of-year break during which committees dissolve and are reformed, and committee members take a well-earned rest! We've had an action-packed year, so we're all ready for a break (from meetings at least - a lot of our work goes on as usual). We'll resume in January - we don't take on any new volunteers during our hiatus, so if you volunteer between now and then (and we hope you will - as you can see below, there are several areas we're really keen to build up), you'll have to wait a little while to get started.

Meeting highlights!

Fandom landing pad!

AD&T co-chair and senior coder Elz has been working for a while on improvements to browsing on the Archive. One thing she's been working on is a new 'fandom landing pad' so that browsing to a fandom will give you the option to browse to some important areas relating to that fandom. In this meeting we previewed her new design - going to a fandom landing page will give you a list of pairings and relationships in the fandom, and a list of authors and artists who have created work in that fandom, along with some basic information about the canon source. It's not quite ready for primetime yet, but it's looking very nifty!

Issues for Love!

Issues for love are the requests submitted by users via Support. We try to work through a few of these each meeting: we're working on ways of making it easier for people to see what has been suggested and what has been decided about the suggestions, but for now we'll include a round-up of our discussions in our meeting updates. Note that a decision to implement something does not necessarily mean it will be implemented soon - we have many issues to work on and a limited number of coders! If you want to see the full (and lengthy!) list of things logged for coders to work on, you can check out our Google Code Issues. If you'd like to adopt an issue, we welcome new coders!

  • Request to add a setting to prompt-meme challenges to disallow anonymous prompts. This seemed like a handy extra feature without too much coding complexity, so we have logged it as an issue for a coder to work on.
  • Request to add an option to hide 'Share' buttons on works to reclaim screen real estate. It's already possible to disallow use of the share button on your own works, but you still see the button. We sympathise with the desire to reclaim the screen real estate, but we decided that adding a user preference to hide the buttons would add too much complexity (the more user preferences there are, the more complicated it becomes for people to figure out what they can set, so we try to limit the options to things where there is a lot of demand for a setting). Instead, we added some extra code to our buttons so that they can be selected with CSS, allowing people to build skins which hide the 'Share' button (or indeed any other button).
  • Suggestion for a 'challenge calendar' listing opening and closing dates for challenge sign-ups, and dates for assignments due, works revealed, authors revealed, etc, which can be opted in when a mod creates the challenge. We loved this idea, but it is fairly complex to implement. Our lovely co-chair Amelia has volunteered to put together a design, so this is something we'll introduce in the future - but probably not for a while.
  • Request for a way to mark WIPs as abandoned, and a way to offer abandoned WIPs up for 'adoption' so that someone can finish them. We all agreed it would be really nice to have a quick way to flag that a WIP would never be finished, so we've logged that as an issue for a coder to implement. The idea of offering works up for adoption seems like it might have more limited appeal, so we agreed that for now, it would be better to leave this as something which people can simply indicate in the tags they use, if so desired (you can add 'Adopt this story' or indeed any other tag you wish as an additional tag to your work).
  • Reflecting on Release 0.8.9

    As most of you reading this will know, we had a big release of new code at the beginning of November. This release included a lot of exciting new stuff; unfortunately, it didn't go as smoothly as we had hoped. In this meeting we reflected on problematic areas and ways that we can improve in future:

    • We combined two big new features: the redesign of our front-end code and the new tag sets code for challenges and collections. We had decided to combine the two because the tag sets needed some front-end work anyway, and at the time we made the decision it made sense to roll the two things into one. However, the tag sets code was time sensitive: because it offers a new system of challenge nominations which significantly reduces the pressure on tag wranglers, we wanted to implement it in time for the big holiday challenges such as Yuletide. This meant that when we combined the two features, we had a lot more stuff to get ready within a set amount of time, which made everything more difficult. When we decided to merge the two, it didn't seem as if this was going to be a problem - but one thing we've learnt is that any deploy can bring unexpected hitches, so in the future if there's a time-sensitive feature we'll be trying to keep that as separate from other code as possible.
    • This was a big visual change, which meant that it had an impact on a large number of users: visual bugs tend to be encountered by lots of users, and even if there are no bugs, people still have lots of feedback about visual changes. We were aware of this; however, given the scale of the response to this deploy we realise we weren't prepared enough. We'll be doing more testing of interface changes in future, and exploring ways of beta-testing them with more users.
    • Since lots of people simply prefer the design they are used to, one thing we could have improved on was providing a way of going back to the old default design. We tried to provide for this with the One Point Faux option, but it had quite a few problems. So, in future this is something we'll be paying more attention to: if we introduce a big change, we'll try to provide ways of opting out. The good news is that going forward, this will actually be easier, because the new skins system is much more lightweight and it should be easier to provide some backwards compatibility (one reason it was problematic this time is that the underlying code for the old system was less than ideal, so everything had been completely rewritten).
    • We didn't have as much support documentation and information as we really needed for this deploy. In particular, we needed much fuller documentation on the new skins features so that people could try them out more easily and our Support team could point to useful information when helping people. There were several factors which led to a lack of documentation: crucially, several of the team who would normally take care of this were dealing with RL issues which limited the amount of time they could spend on it. In order to help avoid problems like this in future, we're building a deploy checklist which includes documentation, to make sure that we've considered whether we need additional documentation regardless of who is available to work on any given deploy. We're also aiming to build up a proper documentation team so that this work is less likely to fall through the cracks: if you're interested in being involved in this team, get in touch with our Volunteers and Recruitment Committee and let them know. We'd love to welcome new people to the team!
    • We also needed more documentation on the new features for testers, so that it was clearer what people needed to test and what they should expect it to do. This is an ongoing aim - we're working to improve our documentation across the board. Improving documentation for testers will also help us to address another issue, which was that feedback from testers got a bit scattered - having clear docs to start with would have helped us make it clearer what feedback needed to go where. Again, we're working on building up our testing procedures generally - if you're interested in getting involved with testing, let us know!

    While the problems we had with this deploy did highlight a number of areas where we need to work to improve, it's not all doom and gloom! There were also a number of things that went right with this deploy - we were able to fix critical bugs within 48 hours of the deploy, the Support team did a wonderful job keeping up with the many Support requests, and there was a huge amount of awesome code in the deploy itself. One reason the site is still in beta is that we're still learning the best processes for development (as well as because our code is new and rapidly changing): in the last four years we've gone from being a tiny group working on coding a 'closed' site (i.e. for the first two years we were just writing the code and testing, we didn't have any real users) to being a much larger group catering for a site of over 28,800 users! So, we're still figuring things out - objects may still shift in transit! We're pleased that we've been able to keep the site up and running, and everything largely functional, even though we've had the odd bump along the way. Thanks to everyone who has worked hard to make this possible!

    Next deploy

    We're hoping to get one last deploy in before the end of 2011! It will include some updates to our HTML parser, some improvements to our static pages for collections and challenges, and Atom feeds for fandom tags! (YEAY!)

    News from our sub-committees

    • Coders have been working on polishing off the issues to go in the next deploy. We're particularly excited about the forthcoming addition of Atom feeds for fandom tags - having tested this out for a good while now on the F/F tag, we think we can implement feeds without too much additional strain on the servers, and since this is a very popular request we're excited about launching it!
    • Testers have been testing the issues for next deploy, and discussing how they'd like to see the subcommittee develop next year. There have been some great discussions on what worked and what didn't this year, how we can build a stronger testing community, and how we can support our testers.

    News from our sister committees

    • Support have continued to work amazingly hard keeping up with a high number of tickets. Looking forward, they're also thinking about our documentation needs and places we need more information for users.
    • Tag wranglers have been discussing needs for next year with AD&T - the two committees will be meeting in the new term to talk over technical needs for tag wrangling. They've also been surveying all tag wrangling volunteers about their experiences this year, with a view to figuring out what works well and what can be improved on.

    If there are things you'd like to do or say, please share them in comments, via the AO3 support and feedback form, by volunteering (we won't be taking on new volunteers until the new term, but you can get in touch now to let us know you're interested), or in whatever medium you feel comfortable with. Everyone is welcome to this party!

    This meeting round-up by Lucy
