AO3 News

Seven Days of Love: Day 3

Published: 2012-02-09 12:32:35 -0500

It's the third day of our February lovefest! So far we've asked you to show your love for the creators of your favourite fanworks by leaving kudos and comments. Now we'd like you to share the love a little more widely with some recs. Today, we'd like to hear recs for old favourites. What are the works you've come back to again and again?

You can leave recs in the comments to this post, create them in your own space elsewhere and link to them from here, or create recs using AO3 bookmarks (just tick the box to say it's a rec). Share your love and tell us about that awesome vid, work or fanart which deserves wider exposure!


Seven Days of Love: Day 2

Published: 2012-02-08 12:13:33 -0500

This year, in the week running up to Valentine's Day, we're running our Seven Days of Love challenge on the AO3! Yesterday we asked you to show your love for works on the Archive with a shower of kudos - 11,564 kudos have been left in the past 24 hours - that's 3 per registered user!

Today, we're asking you to speak your love in a little more detail. Just what is it that makes that piece of fanart so perfect you return to it every day, or that fic so gripping that you stayed up all night to finish it? Why is that one vid so memorable to you? Leave a comment on a work you haven't commented on yet. Your comment could be as simple as letting someone know they made you smile today, or it could be that long-deferred essay you've been mentally composing on why every single frame of their vid is perfect.

As anyone who's ever posted a fanwork knows, a comment can make your day. Let your favourite AO3 creators know they're loved!


Seven days of love on the AO3!

Published: 2012-02-07 17:31:27 -0500

It's February, and Valentine's Day approaches. We figure this is a great opportunity to spread a little love on the AO3!

Over the next seven days, we'll be posting a new prompt each day to encourage AO3 users to spread the love. Celebrate the wonderful work of fandom and share the joy!

We kick off with a simple way to shower people with love: your task for today is to leave kudos on seven works. New works you've just discovered, old ones you unaccountably missed - let their creators know you loved their work! Maybe you read the work when it was posted on another site and you haven't given the AO3 incarnation any love yet, or maybe you downloaded it and never got around to coming back to leave some love. Maybe it's a hidden treasure on the AO3 and hasn't received much attention yet. Seek out those works, old and new, and shower them with hearts!

AO3 kudos image: stylised AO3 made to look like a figure with arms raised in joy, with hearts floating overhead


Update - AO3 Performance (now with balancing unicorns!)

Published: 2012-02-07 07:52:16 -0500

As we hope users have noticed, the recent performance issues on the AO3 have been alleviated! \o/ The fix came courtesy of our lovely Systems team, who tweaked our server settings to add more unicorns! Unicorns are the worker processes which actually handle requests on our servers - if there are too many, things slow down because the servers are overloaded, but if there are too few, things slow down because we are refusing requests we could potentially serve. (It's a whole unicorn balancing act!)
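
For the technically curious, unicorn worker counts are set in a small Ruby config file. This is just an illustrative sketch - the numbers and paths here are made up, not our actual settings:

```ruby
# config/unicorn.rb -- an illustrative sketch only; the worker count,
# timeout, and socket path are hypothetical, not AO3's real settings.

worker_processes 16   # the "unicorns": too many overloads the servers,
                      # too few turns away requests we could serve
timeout 60            # give up on a request the app hasn't answered in time
preload_app true      # load the Rails app once, then fork the workers

listen "/tmp/unicorn.sock", backlog: 64
```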

We weren't optimistic about how much difference changing our unicorn settings would make, because we know we have some underlying application issues which can bog down the database. So, we're really pleased that Sidra was able to find a new balance for the unicorns which has produced noticeable improvements in performance.

Meanwhile, behind the scenes, many of our other fine staff have been working on the performance issues. Sysadmin James_ has been working on partitioning the database: data is stored in tables, and in the new version of MySQL we're using, there's a way to split these tables into parts without affecting anything else. This means that you can search within a much smaller table rather than the whole big block of data, which makes everything faster. With the help of Systems staff, our coders have been going through the slow query database, looking for places where the code runs slowly and needs rewriting to be more efficient. James_ has also been looking into caching options for us - we'd like to use Squid or something similar (the unicorns need some tentacly friends), but we have a lot of different options which affect exactly what is displayed on the page, so we need to figure out the best ways of dealing with those.
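
To give a flavour of what going through the slow query log involves: MySQL records a `# Query_time:` header for each logged statement, and a little script can group the worst offenders by query shape. This is a hypothetical helper for illustration, not our actual tooling:

```ruby
# A rough sketch of triaging a MySQL slow query log: pull out the query
# time for each entry and count how often each slow query shape appears.
# (Hypothetical helper, not AO3's real analysis tooling.)
def slowest_queries(log_text, threshold: 1.0)
  offenders = Hash.new(0)
  current_time = nil
  log_text.each_line do |line|
    if line =~ /^# Query_time: ([\d.]+)/
      current_time = $1.to_f
    elsif current_time && line =~ /^(SELECT|UPDATE|INSERT|DELETE)/i
      # Normalise literals so identical query shapes group together
      shape = line.strip.gsub(/\b\d+\b/, "?").gsub(/'[^']*'/, "?")
      offenders[shape] += 1 if current_time >= threshold
      current_time = nil
    end
  end
  offenders.sort_by { |_, count| -count }
end
```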

Senior coder Elz is also continuing her work on rewriting our browsing filters, which are one of the biggest problem areas in terms of performance - she hopes to have the first version of this ready for testing soon.

We're really happy that the unicorns have helped us address the immediate issue, but we know this doesn't get us off the hook - we still have lots of work to do! While much of this work goes on behind the scenes, we wanted our users to have a glimpse of what's happening when we say 'we're working on it'. :)

We'd like to say a big thank you to all the coders and sysadmins who've been working hard on dealing with these issues. We'd also like to say thank you to all the users who have sent us messages of support and encouragement - we really appreciate it and it's a big boost when we're struggling with some tricky issues. Thanks for your patience while we continue to improve the site. <3


Release notes - release 0.8.10.22

Published: 2012-01-24 04:32:15 -0500

Welcome to Release 0.8.10.22. Elz, Enigel, Jenny S-T, Naomi, Firewolf, Rebecca and sarken contributed code to this release, which was tested by our awesome testing team: Jenny S-T, Jenn Calaelen, Kylie, Lucy, Tai, XParrot, and Zebra.

This release fixes a few bugs, but also rolls out some exciting new features! We've been working on this stuff for a good while, so we're excited to be able to get it out in the world! Now that we have deployed this code, the next release will be dedicated to performance issues: we'll be trying to keep performance fixes separate from anything else so we can better monitor what's working and what's not.

Highlights!

Fandom feeds!

We know that lots of you want a more convenient way of keeping track of new works posted in your fandoms, so we're pleased to announce the launch of feeds for fandom tags! If you go to a fandom tag - for example Naruto, you'll see an option to 'Subscribe to the feed'. You can add the feed to any feed reader, or syndicate it on Dreamwidth, Livejournal, etc, and you'll be notified of new works in that fandom. \o/

The feeds are only visible on 'canonical' tags - the standardised tags which are marked for use in filters and which display on the fandom pages. This ensures you get everything tagged for that fandom, because all the variant versions of a fandom tag are connected to the canonical - for example, Supernatural also gets works marked SPN. However, if something changes and the tag you have subscribed to is no longer marked as canonical, your feed will still work.

If you subscribe to a 'metatag' which includes a lot of fandoms, you'll get updates for all those fandoms - for example, subscribing to Marvel will give you a feed which includes Captain America (2011), X-Men First Class (2011), Marvel 616, Iron Man (Movies), and all the other fandom tags that wranglers have linked to the metatag.

Right now, feeds don't show restricted works (we may introduce authenticated feeds sometime down the road). They also list works by creation date, not update date - this means that updated WIPs won't get bumped back to the top of the feed. We'll be working on improvements going forward - we welcome feedback.
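
If you'd rather consume a feed from a script than a feed reader, any standard feed library will do. Here's a sketch using Ruby's bundled rss library, with stand-in XML rather than real AO3 feed output:

```ruby
require "rss"

# Parse a fandom feed and list the work titles. The XML below is a
# made-up stand-in, not what the AO3 feeds actually emit.
FEED_XML = <<~XML
  <?xml version="1.0" encoding="UTF-8"?>
  <rss version="2.0">
    <channel>
      <title>Example Fandom Works</title>
      <link>https://archiveofourown.org/</link>
      <description>New works in an example fandom</description>
      <item>
        <title>A New Work</title>
        <link>https://archiveofourown.org/works/1</link>
      </item>
    </channel>
  </rss>
XML

feed = RSS::Parser.parse(FEED_XML, false)  # false = skip strict validation
titles = feed.items.map { |item| item.title }
```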

We hope this makes it easier for you to keep track of ALL the works you're interested in!

Subscriptions to individual works

You've been asking for this for a long time, so we're really excited to be able to offer subscriptions to individual works! You can now choose to subscribe to a work-in-progress and get notified by email when a new chapter is added. \o/ HURRAY!

Importing features for Open Doors

One of the key aims of the Archive of Our Own is the preservation of fannish history. As part of this, we want mods to be able to import archives they maintain elsewhere, so they can rescue at-risk archives. We've now finished the code which will allow this! \o/ Working with the Open Doors committee, we'll be importing the Smallville Slash Archive as our first test case in the next month or so. For more on Open Doors and the archive importing feature, check out the Open Doors FAQ. Stay tuned for more news!

Improvements to our HTML parser

Those of you who have been with us for a while will know that improving our HTML parser, which cleans up your HTML when you post a work, has been an ongoing project. In this release, coder Rebecca has included a bunch of improvements, which include fixes for some of the most annoying bugs - no more br tags insinuating themselves into your code where they aren't wanted! HTML parsing is a very, very tricky area, and although we've tested as thoroughly as possible we can't test every possible use case. So, if you find that it does something undesirable to your code, please get in touch with Support.
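
As an illustration of the kind of fix involved (a toy example, not our actual sanitizer code): a naive parser can leave a stray <br /> right after a block-level closing tag, which a targeted substitution can remove:

```ruby
# Toy illustration only -- not AO3's real HTML sanitizer. Removes a
# stray <br /> that has been inserted directly after a block-level
# closing tag, where it only adds unwanted blank space.
BLOCK_TAGS = %w[p div blockquote ul ol li h1 h2 h3].freeze

def strip_stray_breaks(html)
  html.gsub(%r{(</(?:#{BLOCK_TAGS.join('|')})>)\s*<br\s*/?>}i, '\1')
end
```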

Known Issues

If you've been using the site at all recently, you've probably noticed that we've been hit by a lot of performance issues. We've written a bit about what's going on and what we're doing to address it in our recent post on performance. The good news is that the new subscriptions options should actually help a bit - a lot of the weight on the site is people hitting index pages to see if anything has been posted in their fandom and/or if the WIP they're following has been updated. Going directly to a work is much less work for our servers, so giving people new ways to track what they want to read is likely to ease the performance issues rather than exacerbating them. The performance issues are our highest priority right now, so we'll be working on them for the next wee while.

We've had a few reports of email notifications not getting through. It looks like some ISPs may be marking some of our emails as spam: we're looking into the issues, but adding do-not-reply@archiveofourown.org to your email contacts list will help ensure our messages are treated as legitimate.

See our Known Issues page for other current issues.

Release Details

Features

  • Added an option to subscribe to individual works - you can now be notified when someone uploads a new chapter to a WIP! \o/
  • Added feeds for all fandom tags! \o/
  • Removed 'Series:' information from works which aren't part of a series when displaying feeds.
  • Added chapter count to 'Share' option and subscriptions information.
  • Added option to add bookmarks to collections.
  • Added new code for importing archives: people with the archivist role can now import works for other users, and the owners of the works will be notified by email.
  • Added a comments field to icons, so people can give credit and other information.

Bug fixes

  • Fixed importing from Livejournal (formatting was broken by the recent changes on their end)
  • Ensured epub mimetype headers are no longer compressed - should fix a bug causing Internet Explorer to download epubs as zip format, and should allow mobile browsers to recognise the content correctly.
  • Made database index for user accounts unique to prevent duplicate accounts
  • Fixed some outstanding bugs with our banner notices, so that they can now be dismissed by users
  • Parser and sanitizer
    • Fixed the issue where the parser would add an unwanted <br /> tag after various HTML tags
    • Fixed the problem where if there were some paragraph tags in the text, the parser wouldn't correctly add more when you edited.
    • Improved the parser's ability to handle mixed uppercase & lowercase HTML
    • Parser now respects blank spacing inside pre tags
    • Made it possible to assign more than one class to an HTML tag on a work
  • Tags and tag wrangling
    • Fixed bug where wranglers were unable to create tags with accented characters if a tag with the same name in unaccented characters already existed
    • Fixed a bug where tag hierarchical trees on wrangler pages did not display properly
  • Front end fixes
    • Made link styles in #footer more consistently defined
    • Fixed issue causing buttons to display weirdly on chapter reorder pages
    • Fixed bug causing filters to overlap inbox contents on mobile skin
    • Fixed display issues on the static version of fandom pages
    • Added some spacing on the static version of works pages so that kudos buttons aren't too close to text
    • Changed styling of :focus to make it more consistent and visible
    • Fixed header icon display so that it doesn't overlap post new/login buttons in Chrome and Safari
    • Input and h3 on reorder series page both set to full width


OTW action on SOPA/PIPA

Published: 2012-01-17 18:22:09 -0500

The internet has been abuzz recently with comments about the 'Stop Online Piracy Act' (SOPA) currently under debate in the US House of Representatives, and its counterpart the 'Protect IP Act' (PIPA) in the Senate. Organizations such as the EFF and the Library Copyright Alliance have raised concerns that the bills - which are ostensibly aimed at curbing 'rogue' foreign sites - have significant implications for the web internationally, and will work to curb free speech and online creativity.

Here at the OTW, we've been following developments since the bill was first mooted. SOPA has particular implications for sites which include user-generated content because of the broad language in the bill. This means that it has the potential to negatively affect many popular fansites - including the Archive of Our Own and Fanlore - if it is implemented in its current form.

Following protests from many groups, the Obama administration issued a statement which was seen by the New York Times as a significant blow to the proposed legislation. Nevertheless, the EFF argues that it still poses a significant threat.

In order to make sure that members of the US Senate and House of Representatives understand the problematic nature of the proposed legislation, many sites around the internet are taking part in an 'internet strike'. The OTW will be joining this day of action with a banner on the Archive of Our Own and a blackout on our main website, transformativeworks.org. If you are a US citizen, we urge you to contact your representatives and senators to let them know how you feel about these bills.


Performance issues on the AO3

Published: 2012-01-17 17:05:30 -0500

As many users will no doubt have noticed, the AO3 has been experiencing some performance issues since the start of the year. When we posted on 5th January, we were expecting those problems to ease once the holiday rush was over. However, that hasn't turned out to be the case. We're working on ways of dealing with the performance issues, but we wanted to keep you updated with what's going on while we do that.

Why the slowdowns?

In the past month, over 2000 new users have created accounts on the Archive. At the same time, the number of people reading on the Archive - with or without accounts - has been steadily growing. This has been part of a general trend, as you can see if you look at the graph showing number of visits to the Archive since November:

Line graph showing number of visits to the AO3, November to January. The line gradually goes up (with spikes on Sundays) before peaking dramatically on Jan 2.

We're always much busier on Sundays, but the number of visits has been gradually going up each week since November (and the same holds true for the preceding months). However, before December we were hovering around the 135,000 level for visitor numbers at peak times. You can see that the visitor numbers began to climb more dramatically in December, peaking on 2nd January when we had 182,958 visitors. Crucially, after that spike it didn't drop back down to anything like the levels it had been at previously: we're now at more than 150,000 visits on a regular day, and more than 165,000 on Sundays, our busiest day. Wow!

We were expecting a big spike over the holidays, when there are lots of challenges and lots of people with a little spare time for reading and creating. However, we hadn't expected site usage to remain quite so high after the holidays were over! The increases mean that the site is now under a holiday load every day, which is one reason things have been running a little slowly.

The other reason for the slowdowns is that the increase in our number of registered users, and the holiday challenge season, has produced a big increase in the number of works. In fact, 11,516 new works have been posted since the end of December already! More data in our databases means more work for things like sorting, searching, etc - this means that sometimes the database just doesn't serve up the result you need in time, and the unicorn which is waiting to get that result gives up and goes away (yes, really - our servers are assisted by unicorns :D).

We've been expecting this general effect for a while now, and we've been working towards implementing things to deal with it; however, we weren't expecting quite such a big jump in site usage in the past month!

What are you doing about this?

The Accessibility, Design & Technology and Systems Committees had a special meeting on Saturday to discuss ways of dealing with the immediate problem, as well as longer term plans. It can be tricky to test for high load situations before they actually occur, but once they do occur there's lots of data we can gather to help us address the most crucial issues. (We're also working on implementing more tools which will help us test this stuff before it comes up.)

Short term

More caching: We already cache pages (or sections of pages) across the site - this means we store a copy which we can serve up directly, instead of creating the page every time someone wants to use it. If something changes, then the cache is expired and a new, updated copy is created. Hitherto, we've focused on caching chunks of information which are unlikely to change rapidly: for example, on any works index the 'blurbs' which show the information about each work are cached. However, some of the heaviest load is caused by rapidly changing pages like the works index. We're moving towards more caching of whole pages, so that a new copy of the works index (for example) will be created every five minutes rather than generated each time someone asks for it. This means things like works indexes will be a little slower to update - when you add a new work, it won't appear on the list until the cache expires - but that five minute delay will massively reduce the weight on our servers.
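
The idea behind whole-page caching with a five-minute expiry can be sketched in a few lines of Ruby. (The real implementation sits on top of Rails' caching layers; this just shows the concept.)

```ruby
# A minimal sketch of whole-page caching with a time-based expiry --
# the concept behind the change described above, not AO3's actual code.
class PageCache
  Entry = Struct.new(:body, :cached_at)

  def initialize(ttl: 300)   # seconds; five minutes, as described above
    @ttl = ttl
    @store = {}
  end

  # Serve the cached copy if it's still fresh; otherwise rebuild the
  # page with the given block and cache the new copy.
  def fetch(key)
    entry = @store[key]
    if entry && Time.now - entry.cached_at < @ttl
      entry.body
    else
      body = yield
      @store[key] = Entry.new(body, Time.now)
      body
    end
  end
end
```

Until the entry expires, every request for the same key is served straight from memory, which is why even a short delay on fast-changing pages takes so much weight off the servers.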

More indexes: We have a few places in our databases - for example the tables for the skins - which could use more indexes. Indexes speed things up because the server can just search through those rather than the whole table. So, we're hunting out places where more indexes are needed, and implementing them. :)
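
Adding an index in a Rails app is a one-line migration. The table and column here are hypothetical examples, not the actual change:

```ruby
# A hypothetical migration for illustration: index the skins table on
# an author column so lookups scan the index instead of the whole table.
class AddIndexToSkins < ActiveRecord::Migration
  def self.up
    add_index :skins, :author_id
  end

  def self.down
    remove_index :skins, :author_id
  end
end
```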

Medium term

Bad queries must die: We have a few queries which are very long and complicated, and take a long time to run. We need to rewrite these bits of the code to make them simpler and faster! In many cases this will be quite complicated (or else we would have done it already), but it's a priority to help us speed things up.

New filters for great justice: The filters that are implemented on our index pages are not really optimal considering the size of the site now - the limitations of that code are the reason we have to have a 1000 work cap on the number of works returned. We have been working on this for a long time - we need to completely throw out what we have and implement a system which works better for the site as it is now. Again, this is really complicated, which is why it's taken us a long time to achieve it even though we knew it was important - the good news is that we have now done quite a lot of work on this area and the first round of changes should be out in the next few months.

Long term

Long term, we're going to be moving to a setup which allows us to distribute our site load across more servers. This will involve database sharding - putting different bits of the database on separate servers - so it will take quite a lot of planning and expertise. If you're a user of Livejournal or Dreamwidth, you might be aware that your journal is hosted on a certain 'cluster' - we'd be moving to a similar system. We want to make sure we do this right, but based on the way the site is growing we think this is now high priority, and our Systems team are working to figure out the right ways forward.
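
The clustering idea can be sketched very simply: each record's id determines which database server holds it. (Shard names here are made up, and a real setup needs much more care around rebalancing and cross-shard queries.)

```ruby
# A sketch of shard routing: the record id determines which database
# server the record lives on. Shard names are hypothetical.
SHARDS = %w[db1 db2 db3 db4].freeze

def shard_for(record_id)
  # Simple modulo routing; a production setup would more likely use a
  # lookup table or consistent hashing so shards can be rebalanced.
  SHARDS[record_id % SHARDS.length]
end
```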

Summary

We know it's really frustrating when the site runs slowly or times out on you: many apologies. We really appreciate users' patience while we deal with the issues. As you'll see from the above, there are some immediate things we can do to ease the problems, and we also have a good sense of where we need to go from here. So, while these changes need to be implemented as a matter of urgency, we feel confident we will be able to tackle the problems. If you have expertise in the areas of performance, scalability and database management, we would very much welcome additional volunteers.

As we move forward on dealing with problem spots on the site, we may implement some changes which are visible to users: the caching on the index pages and the changes to browsing and searching are two of the most obvious. We'll let you know about this as we go along - we think the effect will be beneficial for everyone, but do be prepared for a few changes! You can keep up with status and deploy news on our Twitter @AO3_Status.

While the growth in the site means we're facing some problems a little sooner than we expected, we're really excited about the fact so many people want to read and post to the AO3. Thanks to everyone for your fannish energy - and apologies for the fact we sometimes slow you down a little.


Rush hour on the AO3!

Published: 2012-01-05 09:59:30 -0500

As many of you will have noticed, we had some site slowdowns and 502 errors over the first couple of days of the year. Apologies for the inconvenience! If you've run into this problem and been wondering what was going on, you might be interested in this:

Line graph of visits to the AO3, 4 December to 3 January. The graph peaks sharply on 1st January

Yes, it looks like lots of fans decided to celebrate the New Year with some delicious fanworks. On Monday 2nd January we had 182,958 visits, and over 1,066,216 pageviews! Furthermore, an octopus swam off with our servers - volta_arovet's Texts From Cephalopods has had 46,301 hits at the time of writing! So, our servers had plenty of work to do!

Over 2000 new users have joined the Archive in the last couple of weeks, and we have hosted several great challenges, including Yuletide (2598 works!), Due South Secret Santa (a more modest 34 works), and Homestuck Ladyfest Exchange (124 works). So, while we're sorry to have had some slowdowns, overall we are super pleased with how well our shiny new servers have held up - those of you who were with us during the holiday season in previous years will remember that the high traffic of holiday challenges made our old servers very sad.

Looking forward, we're not too worried about performance in the immediate future - there are some code improvements we know we need to make which will improve matters a lot, so those will be high priority. If the AO3 continues to expand at the same rate as this year, we will be looking at more servers sooner rather than later. But in light of the graph above, we're pleased that while we certainly slowed down, we didn't grind to a halt! Thanks to all the coders and sysadmins who did the work to make this possible, and thank you to all the OTW members whose donations helped us buy those hardworking servers (we are always grateful for volunteers or donations)! And, of course, thanks to everyone who reads and posts on the AO3 - we're excited to welcome so many of you!

Once again, apologies to those of you who have been affected by the slowdowns - but hurray for so much beautiful fannish activity!


