AO3 News

Survey Sunday #6: OTW Website

Published: 2012-07-29 06:46:50 -0400

A short update on Survey Sundays

We've been working on a rough schedule for our Survey Sunday reports to give the committees involved time to review their data and make sure we systematically cover all questions. Unfortunately, this means that we may not be able to answer your Survey Sunday inquiries in a timely fashion, as the related post might be scheduled for weeks or months off. We've done our best to push posts which people showed particular interest in to the top of the pile, but this hasn't always been possible. Although we warned about possible delays in our announcement post, we wanted to give you a specific heads up about this.

In short: we're still very happy to take questions, but we will integrate them into our regular releases, which might mean long delays. Thanks for your patience!

About the answers

Today we'll be answering questions #84 to #88, which relate to the OTW website's usage, design, and content. We've prioritized preparing this data so that our webmasters can factor the results into their planned OTW website redesign. All questions presented in this post are complete and include final numbers, including the last three open (i.e., free-text response only) questions.

It's important to note that everyone who answered Question #84 by saying that they hadn't read the site before, or that they didn't remember having done so, skipped all the other site-related questions and was automatically redirected to the last question in the survey (#89). We'll also mention this fact when relevant in the corresponding sections.

Additional notes on Freeform Questions #86-#88

In these questions users were given a text box in which to enter their opinions. This type of question obviously requires more time and labor to answer than one with limited choices. That, along with the skipping logic, these questions' position at the end of a long survey, and possibly a lack of awareness of the OTW website, likely contributed to their receiving a small number of answers compared to the rest of the survey. (Only 4-5% of total survey respondents answered these questions.) Many categories thus represented the opinions of only a very small portion of the respondent pool, so we have chosen not to present these results as percentages, preferring to focus on the actual counts.

We categorized the answers to these questions by assigning one or more categories to each (similar to tagging) rather than simply dividing responses into distinct bins. Some comments were quite lengthy and included several different ideas, suggestions, and concerns. For this reason, the number of answers in each category will often add up to a number greater than the total number of respondents who answered a question.
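As a rough illustration of why per-category counts can exceed the number of respondents, this tagging-style categorization can be modelled as a multi-label tally. This is only a sketch with hypothetical answers; the category names are borrowed from the breakdowns later in this post.

```python
from collections import Counter

# Hypothetical free-text answers, each hand-assigned one or more
# categories (similar to tagging), as described above.
categorized_answers = [
    ["Organization/Navigation"],
    ["Organization/Navigation", "Visual concerns"],
    ["Missing Content", "Content Presentation", "Visual concerns"],
]

tally = Counter()
for categories in categorized_answers:
    tally.update(categories)

total_respondents = len(categorized_answers)      # 3 answers...
total_category_assignments = sum(tally.values())  # ...but 6 category counts
```

Because one answer can carry several categories, summing the per-category counts (6 here) overshoots the respondent count (3), exactly as noted above.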

84. Have you read the OTW website before?

Answer Options Response count Response percentage
Yes 1,772 35.7%
I don't remember 1,145 23.1%
No 2,046 41.2%
Answered question 4,963
Skipped question 1,023

82.9% (4,963) of the people who took the survey answered this question. People who answered either "I don't remember" or "No" were immediately redirected to the last question in the survey (question #89), while people who skipped this question, or answered "Yes", were directed to the following questions related to the site.

Only 35.7% of the people who answered this question chose "Yes", indicating that they had read the OTW website. 41.2% said "No", and 23.1% said they didn't remember.

graph for question 84, description in the text above.

85. Based on your previous visits to the OTW website, how strongly do you agree or disagree with the following statements?

Answer Options Strongly agree Agree No particular opinion Disagree Strongly disagree Response Count
The OTW website made it easy to find the information I was looking for. 167 748 640 127 24 1,706
The OTW website is easy to navigate. 178 776 600 126 22 1,702
The OTW website met my expectations as a user. 185 766 648 79 18 1,696
Answered question 1,707
Skipped question 4,279

Only 28.5% (1,707) of survey respondents gave an opinion on at least one of the statements. Of the 4,279 people who skipped this question, 3,191 did so automatically by answering "No" or "I don't remember" to question #84. This means that 18.2% of the people who took the survey skipped this question purposefully. Taking into account that only 2,795 people would have seen this question (1,772 who answered "Yes" on question #84 and 1,023 who skipped question #84), about 38.9% of the people who saw this question skipped it.
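The percentages above follow directly from the response counts quoted in this post; a quick sketch of the arithmetic (total respondent count of 5,986 inferred from question #84's figures):

```python
total_respondents = 5986    # everyone who took the survey
answered_q85 = 1707
skipped_q85 = 4279
auto_skipped = 1145 + 2046  # "I don't remember" + "No" on question #84
purposeful_skips = skipped_q85 - auto_skipped  # 1,088 deliberate skips
saw_q85 = 1772 + 1023       # "Yes" on #84, plus those who skipped #84

answered_pct = round(100 * answered_q85 / total_respondents, 1)       # 28.5
purposeful_pct = round(100 * purposeful_skips / total_respondents, 1) # 18.2
skip_rate_of_viewers = round(100 * purposeful_skips / saw_q85, 1)     # 38.9
```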

In general, the response counts for the three statements were nearly identical, and a high percentage of people skipped the question purposefully compared to other sections and questions. The percentage of people expressing a particular opinion on each statement varied little. Given that the statements are related (particularly "easy to navigate" and "made it easy to find the information I was looking for" – the third is more ambiguous), this is not overly surprising.

Below we'll analyse each statement separately.

The OTW website made it easy to find the information I was looking for.

Answer Options Response Count Response percentage
Strongly agree 167 9.8%
Agree 748 43.8%
No particular opinion 640 37.5%
Disagree 127 7.4%
Strongly disagree 24 1.4%
Answered question 1,706
Skipped question 4,280

28.5% (1,706) of people who took the survey gave an opinion about this statement. This is almost 100% of the people who answered question #85 in any form (1,707).

53.6% (915) agreed in some manner (whether strongly or not), 37.5% (640) expressed neutrality by choosing "no particular opinion", and 8.8% (151) disagreed in some manner (whether strongly or not).

graph for question 85 first answer, description in the text above.

The OTW website is easy to navigate.

Answer Options Response Count Response percentage
Strongly agree 178 10.5%
Agree 776 45.6%
No particular opinion 600 35.3%
Disagree 126 7.4%
Strongly disagree 22 1.3%
Answered question 1,702
Skipped question 4,284

28.5% (1,702) of people who took the survey gave their opinion about this statement (about 99.7% of the people who answered question #85 in any form).

56.1% (954) agreed in some manner (whether strongly or not), 35.3% (600) expressed neutrality by choosing the "no particular opinion" option, and 8.7% (148) disagreed in some manner (whether strongly or not).

graph for question 85 second answer, description in the text above.

The OTW website met my expectations as a user.

Answer Options Response Count Response percentage
Strongly agree 185 10.9%
Agree 766 45.2%
No particular opinion 648 38.2%
Disagree 79 4.7%
Strongly disagree 18 1.1%
Answered question 1,696
Skipped question 4,290

28.3% (1,696) of people who took the survey gave their opinion about this statement; this is about 99.4% of the people who answered this question.

56.1% (951) agreed in some manner (whether strongly or not), 38.2% (648) expressed neutrality by choosing the "no particular opinion" option, and 5.8% (97) disagreed in some manner (whether strongly or not).

graph for question 85 third answer, description in the text above.

#86 - What should we improve about the website, in your opinion?

Answer Category Response count Response percentage
No Opinion 114 30.4%
Mistaken Site 14 3.7%
No Improvements Necessary 24 6.4%
Suggested Improvements/Stated Issues 223 59.5%
Answered question 375
Skipped question 5,611

Question #86 was the first of the three free-form questions in the OTW website section, and used the same skipping logic as #85. Thus, only 2,795 people saw the question, and only 375 of them (13.4%) answered it in any form. (See the note at the beginning of this post for a description of how answers to this question were categorized.)

Out of 375 respondents, 114 answered to clarify that they had no opinion, 14 gave feedback for a different site (often AO3), 24 stated that no improvements were necessary and that the site was fine the way it was, and 223 pointed out issues with the site or offered suggestions to make it better.

graph for question 86, description in the text above.

Suggested Improvements

Answer Category Response count
Organization/Navigation 100
Searching and tagging improvements 40
Missing Content 33
Front page content and general page layout related issues 33
Visual concerns 29
Content Presentation 26
Other 15
Total Number of Responses with Suggested Improvements/Stated Issues 223

223 survey takers left comments suggesting site improvements or describing site issues.

The largest number of comments involved the overall organization and navigation of the site (100 answers) and the navigation/searchability of the blog (40 answers). Users expressed frustration with not being able to find the content they were looking for, even though that content is present on the site. Some users were only able to find the information after a Google search of the site; others said they reached it in ways they did not expect, or that they wouldn't have found the content if they hadn't known it was there in the first place. There were several suggestions that information on volunteering should be made easier to access and find, and that the tagging of posts should be improved.

Another common theme was the organization of the front page and the general layout (33 answers). Here one of the biggest points was users not seeing the search box: some noted it was hard to find, while others, having overlooked it, asked for one to be added. Other frequent topics included adding some static information in addition to the blog and shortening the path to project links and information.

A third common type of feedback was requests for improvements to visual aspects of the site (29 answers). These fell into two main themes. The most frequent request was a review of the color scheme of the site both for accessibility and aesthetic reasons. Others requested that the OTW website layout be more lively, with images.

A fourth common topic included missing content and content presentation. 33 comments involved content being missing or hard to find, including information about the OTW, its projects and committees, and its finances, as well as information about helping the Org by volunteering, joining, or donating. Note that some users identified multiple types of missing content, giving a total of 67 answers.

26 comments were related to content presentation. Within this feedback, two frequently mentioned issues were the clarity of the language in the textual content, and a request for more charts, videos, pictures and general visual cues to make the content more appealing and easier to read.

The remaining 15 comments covered a variety of topics not mentioned above, such as improvements to the calendar feature, performance issues, search engine optimization and accessibility.

graph for question 86 suggested improvements breakdown, description in the text above.

Breakdown of missing/hard to find content mentioned

Answer Category Response count
Projects/committees 21
About the Org/about the site/financial information 19
Volunteering/membership/donation 16
Elections 11
Total Number of Mentions of Missing/Hard to Find Content 67

While the nature of missing content was the focus of question #87, in answer to this question several respondents specified types of content they had tried to access but could not find, had difficulty finding, or felt could be clarified further. The most commonly mentioned content was information on projects and committees (21 answers), information about the OTW as a whole, such as general structure and finances (19 answers), information on volunteering, donating and membership (16 answers), and elections (11 answers).

graph for question 86 missing/hard to find content breakdown, description in the text above.

87. What information would you expect to find on the OTW website that's not currently there?

Answer Category Response count Response percentage
N/A - No Opinion 98 38.7%
Mistaken Site 6 2.4%
Nothing particular/all set 43 17.0%
Suggested content/improvements to content 106 41.9%
Answered question 253
Skipped question 5,733

Question #87 used the same skipping logic as #85 and #86; thus, only 2,795 people saw the question. 253 (less than 10%) of them decided to answer it.

Of the 253, 98 of them answered to clarify they had no opinion on the topic, 6 mistakenly gave feedback on specific OTW projects (often AO3), and 43 answered that they didn't think anything in particular was missing. 106 answers contained suggestions to add new or improve existing content.

graph for question 87, description in the text above.

Nature of content improvements requested

Answer Category Response count
Has missing content/can use more information 87
There but hard to find 4
Can use more updates 21
Total number of answers with content suggestions 106

Although the question specifically asked about missing content, the responses covered a variety of concerns. 87 answers mentioned missing content or asked for more detailed content, 4 answers stated that the information they were looking for was there but hard to find, and 21 asked for more frequent updates to particular content.

graph for question 87 nature of content improvements requested, description in the text above.

Breakdown by type of expected information not found

Answer Category Response count
Other topics 46
Legal 18
Volunteering/donating/membership 17
Finances 15
Project 13
About the Org/its structure 12
Committees/Board 11
Total number of answers with content suggestions 106

Among specific topics, the greatest number of requests for more information concerned the legality of fandom and previous legal cases (18 answers). It is important to note that half of these (9 answers) specifically requested information on legal issues with an international scope. The second most frequent request was for information on volunteering and donating (17 answers). Specific requests included more current and detailed information on what types of volunteers are needed, what new volunteers should expect, and what the difference is between becoming a member, volunteering, and donating.

Other common requests were more detailed information on OTW finances and how donations are put to use (15 answers), more detailed and up-to-date information on projects (13 answers), more information on how the OTW is structured (12 answers), and information on committees and the Board (11 answers).

The remaining answers contained a variety of content requests such as more information on staffers and volunteers as people, FAQs, translations, resources for existing volunteers, or better access/merging of information originating from various OTW communications outlets.

graph for question 87 breakdown of information expected but not found on the OTW site, description in the text above.

88. Do you have any other feedback about the OTW website?

Answer Category Response count Response percentage
N/A - No opinion/no further comments 127 54.3%
Mistaken site/off topic 13 5.6%
Further suggestions/comments 94 40.2%
Answered question 234
Skipped question 5,752

Question #88 asked for additional comments on the OTW website on topics that weren't covered by the previous two questions. Since the question was under the same skipping logic as #85, #86 and #87, only 2,795 people saw it. Out of these, 234 (less than 10%) chose to answer this question.

Of these answers, more than half (127 answers) were to state that the respondent had no further comment or that they had no opinion. Answers that only referred to previous questions by number (e.g., "what I said in question 86") with no further comments were also categorized here.

13 of the comments were feedback on other OTW projects, general commentary on the OTW, or other off-topic content, such as frustration with the length of the survey (this was the second-to-last of 89 questions).

94 answers included further suggestions or general comments on the website.

graph for question 88, description in the text above.

Nature of feedback on the website

Answer Category Response count
Specific criticism or suggestions 53
No other issues/overall happy 30
Thank you and encouragement 14
General dissatisfaction 5
Positive comments on specific features or aspects 4
Total number of answers with further comment 94

The most common type of feedback given in response to this question was suggestions for improving the site (53 answers). The most frequently mentioned topics were the site's organization and its look.

30 users used this space to state that, regardless of their previous suggestions or comments, they were overall happy with the site. 14 users thanked the website team or sent them encouragement. 5 answers expressed non-specific dissatisfaction with the site as a whole, and 4 answers included comments on particular aspects of the site that worked well for the respondent.

graph for question 88 breakdown by nature of feedback, description in the text above.


Happy SysAdmin Day!

Published: 2012-07-27 11:04:27 -0400

Happy SysAdmin Day! Here at the OTW, we'd like to take the opportunity to say thank you to our fantastic Systems team!

Systems work tirelessly behind the scenes to make everything work smoothly for the whole organisation.

What you see

Screenshot of a tweet reading 'The #AO3 will have some planned downtime on Thursday 26 July for some software upgrades:'

What Systems do

Maintain the servers for the Archive of Our Own, Fanlore, Transformative Works and Cultures, Open Doors, and our internal wiki; install server software; arrange the installation of new hardware; find solutions when Fanlore is hit by a wave of spam; optimise performance on the AO3; wake up in the middle of the night to fix things when our servers melt; maintain and administer web development environments for our trainee coders; research and consult on hardware purchases; answer endless technical questions so Communications can post and tell people what's happening; pull stats to help us understand more details about our projects; set up new software tools; and much, much more.

Because of Systems, fandom can own the servers!

Our Systems committee are super-duper awesome and make it possible for all our projects to exist! <3 THANK YOU for your awesome work and all that you do!

Go go Systems monkeys!

Image of awesome dancing monkey with caption 'Systems monkeys rule, oh yes'


852 Prospect: Coming Soon

Published: 2012-07-26 03:27:45 -0400

Dear Sentinel fandom!

We're writing to let you know that the import of 852 Prospect into the Archive of Our Own (which we announced in May) has been postponed until sometime in August. Our apologies for the delay!

The import has been rescheduled because of recent performance issues on the Archive (which our coders are hard at work fixing). We're now waiting until the next code deploy before we import 852 Prospect, but Open Doors is still working to make sure it happens.

Thank you everyone for your patience, and if anyone has further questions, please don't hesitate to contact Open Doors.


Spotlight on Support: AO3 ticket stats

Published: 2012-07-25 05:07:03 -0400

This is going to be a very boring post. It's going to be full of numbers and graphs – those things that I may or may not have spent many years at school colouring in with lovely coloured pencil without understanding them much (because I was apparently too innocent a mind to turn them into rude, crude approximations of things not related to mathematics except in the most abstract sense). And yet, these will be very easy numbers to understand. I am not a statistician, nor are the levels of data I have access to very deep. What I am is a member of the Support Committee with a curiosity about the numbers and types of tickets that pass through our hands, who decided to add up the numbers one day and turn them into graphs. That was last year, and somehow the lure of the bar chart means that I have continued to collate information through to where we are right now, having just finished the second quarter of 2012.

In this post, I'm going to summarise the types of tickets received, what categories they fall under, and the general trends we witness. But first, some explanation of the process.

Collecting The Numbers

I'm sure that the method I have used is going to come under some degree of criticism for being inefficient; however, our Support software, provided by 16bugs (see Sam's spotlight post for more information), was not designed for data export. This means that the only way to extract numbers of tickets is to do it manually. And by manually, I mean I go through the email duplicates of each ticket one by one, assigning them a category, then add up the numbers for each month and enter them into an Excel spreadsheet.

What this method is, for all its faults, is quick, which means that I can rapidly pull up a given time period to see what sort of tickets were received between those dates. These graphs were originally created as an informal overview of ticket stats (which is a position they remain in – production of these stats is not an official Support Committee duty). They are simply counts of the original tickets, what they are about, and when they were received. They are not a count of how quickly they were responded to, who responded to what, or what follow ups were conducted with the users.


I'm going to leave direct explanations of the categories until the sections for the respective quarters, as these change on a quarter-by-quarter basis. This is simply because new features are added, which generates new issues, while old issues are resolved. For example, squid caching was not implemented until June of this year, so prior to that it was not shown in the graphs because issues relating to it did not occur. Here I'll instead explain the process by which tickets are categorised.

If you've ever submitted a comment or query to Support you will notice that on our form is a drop down menu.

screenshot of menu options: Bug Report, Feedback/Suggestions, General/Other, Help Using the Archive, Languages/Translation, Tags

These are not the categories I have used to sort tickets. Since the categories in the menu are so few and so broad, I felt it necessary to break them down further and count tickets according to the specific archive functions and features they related to.

A new category is created in my sorting only when an issue generates a large number of tickets and isn't a transient bug. For example, if a ticket is related to subscription emails, it is categorised under "Subscriptions", not "Email", because it relates to a specific Archive function (in this case subscriptions) that has an existing category. If it were related to invitation emails, it would go under "Activation/Invitation/Login". However, if it's related to kudos batching, it goes under the general "Email" category, because there is no category for kudos.
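A minimal sketch of this "most specific category wins" rule. The helper function and its keyword matching are hypothetical illustrations, not the actual sorting process, which is done by hand:

```python
def categorise(ticket_text: str) -> str:
    """Assign a ticket to the most specific matching category,
    falling back to more general buckets."""
    text = ticket_text.lower()
    # Most specific Archive-function categories are checked first.
    if "subscription" in text:
        return "Subscriptions"
    if "invitation" in text or "invite" in text:
        return "Activation/Invitation/Login"
    # General email bucket, e.g. kudos batching (no kudos category exists).
    if "email" in text:
        return "Email"
    return "Bug Report"
```

For example, a ticket mentioning "subscription emails" lands in "Subscriptions" rather than "Email", because the subscription check runs before the general email fallback.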

The Stats

2011, In Brief

I won't linger on 2011 too much (see Q1 2012 for an explanation of categories), since this information was a little more awkwardly hacked together than for 2012 – by which time I had sorted out my process for quickly organising tickets.

bar chart with different colors for every month in 2011, representing absolute ticket numbers for each in 15 different categories
(full size)

Prior to August, tickets were collated by the Support Chair, using slightly different categories than I did. I attempted to meld the two sets of information as best I could to produce the above year overview.

What is easily and clearly visible is the spike in tickets in November, resulting from a change to the front-end presentation of the AO3. The biggest spike is split between Interface/CSS tickets and Feedback. While many of the tickets sorted under Feedback were directly related to the changes to the AO3's interface, they did not contain bug reports or requests for information, and therefore fell under the heading of Feedback.

Q1 2012

Categories for Q1 2012:

  • Error 502 - the 'server busy' messages
  • 1000 Works - queries related to why we have a 1000 work limit on the fandom landing pages
  • Activation/Invitation/Login - problems activating accounts, getting invitations, or logging in
  • Admin/Abuse - issues that need to be examined by Admin or Abuse teams
  • Bug Report - Reports of transient bugs that aren't separately categorised
  • Collection/Challenges/Prompts - any problems/queries about these
  • Downloads - errors, bugs, queries related to downloading
  • Feature Request - any 'can I have/I would like/will you implement' queries
  • Feedback - any complaints, or any positive feedback (alone with no other feature-related issue)
  • Help/Information - any questions about AO3/OTW in general, or how to use specific features
  • Interface/CSS/Display - problems/queries relating to how the archive appears on screen, i.e. interface
  • Imports - issues with importing from LiveJournal/
  • Open Doors - questions related to fics imported through OD
  • Search/Browse/Filter - problems or queries about sorting through archive contents
  • Tag Wrangling - any tag related questions

bar chart with different colors for Jan, Feb, and Mar 2012, representing absolute ticket numbers for each in 14 different categories
(full size)

Possibly because the holidays were still going on at the beginning of January (and thus people had more time to spend on fandom sites), we saw more tickets in general than during the following two months.

Q2 2012

Categories added for Q2 2012:

  • Embedding – queries/problems with embedding media (images/audio/video) into Works pages
  • Bookmarks – queries/problems involving bookmarking
  • Caching – bug reports that are actually caching issues (e.g., reporting 0 works in a fandom as a bug – this is a caching issue, or appearing as logged in as another user). The kind of caching which causes these particular bugs was only implemented in June.
  • Email – email issues unrelated to other categories (e.g., kudos email batching)
  • Subscriptions – issues/queries to do with the subscribe feature

bar chart with different colors for Apr, May, and Jun 2012, representing absolute ticket numbers for each in 14 different categories
(full size)

To break down the invitation-related tickets: in June we received 140 tickets related to invitations.

  • How Do I Use This Invite: 22
  • Did Not Receive Invitation Email: 31
  • Fell Off Invite List (unaware of security changes): 41
  • (of those, 6 admitted to re-adding themselves)
  • General Invite Queue Unhappiness: 10
  • Can I have An Invite?: 9
  • I Requested Invites, Where Are They?: 12
  • Paid Accounts: 3
  • My friend on needs an invite: 12
  • Need Invites for a Challenge: 5
  • Please Remove Me From Queue: 1

The remaining 37 tickets in that category were related to account activation or login issues.

bar chart with different colors for each week of June 2012, representing ticket numbers for each in 20 categories
(full size)

This graph shows how the tickets were distributed across the weeks of June. In week 23 (commencing 4th June) we received the greatest number of queries regarding invites, as this was the point at which the invitations queue started growing at a rate of nearly 1000 new additions per day (a rate that has since slowed to 300-odd per day). This coincided with the point at which the AO3 servers started creaking under the strain of many more visitors and a filtering system that was originally designed with a smaller user base in mind.

When squid caching was implemented to help ease the strain (around week 24) we saw an increased number of tickets related to this change. In week 25, when filtering was disabled, we began to see an increased number of tickets related to that. (Originally, the message was ill-worded, appearing to be an error message, rather than an admin message – this has since been altered, and tickets regarding the filtering being 'down' have disappeared.)

And This All Means...

I always have fun posting these stats to the support committee. Everyone already knows more or less how things have gone, but sometimes looking at the numbers surprises us. When I originally created them, one frequent question was "What's the most common ticket you get?", to which we would generally reply "queries regarding the 1000 work limit". I was curious as to whether this was actually the case. As it turned out, Feature Requests came in more often, and questions about the 1000 work limit came lower down the list.

If you are wondering how many tickets we answer altogether, I can tell you that at the time of writing there are no unanswered tickets in our support software (except for one bugged ticket, which we are attempting to resolve with 16 Bugs). Every single ticket we receive is read and personally answered by a member of our staff, usually within a day or two. So, the answer is: we answer all of them.

graph showing the number of tickets for each month from Jan 2011 (170) to May 2012 (590)
(full size)

This post is by Support staffer Yshyn. If you find a bug, have a question about the site, or want to request a feature, you can submit a Support request.


Planned Archive downtime: Server software upgrade

Published: 2012-07-23 07:07:30 -0400

The Archive of Our Own will be down for planned maintenance for approximately 90 minutes from 07.00 UTC on Thursday 26 July (see what time this is in your timezone). We'll be upgrading our server software during this time (more details below for the curious!).

We'll keep users updated via our AO3_Status Twitter account as the work progresses. Thanks for your patience while we complete this work!

Server software upgrades

This downtime will allow us to upgrade Nginx and MySQL on our servers. It's important for us to keep this software up-to-date in order to avoid bugs and get better performance.

Nginx is web server software that everyone's browser communicates with – when you come to the Archive and request a work, Nginx does the job of communicating with the application and getting the data you wanted. It handles some requests itself and passes on those that are too complex for it.

MySQL is the database which handles all the persistent data in the Archive - that's things like works. We're updating this to a much more recent version of the software, which will bring us some performance gains. We're also moving from the Oracle branch to Percona, which will bring us some additional benefits: it should give better performance than Oracle, and will also give us some additional instrumentation to monitor the database and identify problem areas. In addition, we hope to draw on the support of the company who produce it (also called Percona).

Users shouldn't see any changes after this update. However, we wanted to keep this work separate from our recent RAM upgrade so that if any problems do arise, we will find it easier to identify the cause.


Some of you might have noticed in Support tickets a link to something called "Trello" or might have noticed a new FAQ about "Internal Tools", but weren't sure what these things might be. Or you might have wondered if you're the only one who's seen a bug or requested a new feature. Or you might just be curious about our code.

We'd like to introduce the three public tools that we've been using to track development information for the Archive. We've been using them for different lengths of time, but haven't really advertised them. In the spirit of transparency, we thought we'd present them to you! We're hoping these tools will make it easier for you to understand what's going on behind the scenes and what we're working on.

First up is our very organized friend, the Trello Feature Requests Board. He's a detail-oriented individual who loves hearing about where our users think the Archive should go and what features our coders should implement. He records all of the requests for new features or revisions of current features. Browsing his cards, you can see features and changes that have been proposed, ones that have been accepted for development and will be coded eventually, ones that have been implemented, and even ones that aren't currently possible on the Archive. He even takes notes on coders' and users' thoughts on the various features. I should note that he might not have labeled some cards clearly, so it's always worth browsing around a little.

Next is the Google Code Issues Tracker. She is our best beta, keeping track of features that have been approved for development on Trello, as well as the bugs our coders, testers, and users have found (all 3200 and counting)! She's been working for us the longest, keeping a list of everything we've broken and everything we've fixed. She's got some categories you can search by, but you can also just search by keywords.

Finally, we have the otwarchive on Github. She's the one in charge of our actual code, where the Archive gets rewritten. She'll take notes if you've figured out how to fix something, and even let you copy out the whole Archive code to work with on your own server. She's also starting to collect notes about our development and design guidelines as the AD&T committee standardizes them for the version 1.0 release.

We have a FAQ with a few more questions and answers about Trello, Google Code and Github, as well as links to all three. You can access it here: Internal Tools FAQ. If you have any questions about any of these tools or about something on them, let us know at our Support form (which will also soon have links to these tools)!

- Sam J., AO3 Support


Planned Archive downtime: RAM upgrade

Published: 2012-07-17 04:37:57 -0400

The Archive of Our Own will be down for planned maintenance for approximately three hours from 15.00 UTC on Friday 20 July (see what time this is in your timezone). During this period we'll be installing some new RAM and performing some other maintenance (more details below for the curious!).

We'll keep users updated on our Twitter AO3_Status as the work progresses. Thanks for your patience while we complete this work!


The Beast: cartoon style image of server
Our database server looking grumpy about having too little RAM!

We're doubling the RAM in our database server and in our two application servers. Increasing RAM will help our system cope with more users: for example, it will allow us to run more unicorn workers, which serve up the content you're trying to access. This should help site performance as the site expands.

You can imagine the unicorns lining up in the hall of RAM to fetch you things from the treasure trove of fanworks: if there aren't many unicorns, you have to wait till one can serve you, which sometimes means you get a 502 error. We can increase the number of unicorns to make things go faster for you, but if the hall is too small (there isn't enough RAM) then things get crowded and inefficient and everything slows down again. More RAM allows us to increase the number of unicorns without slowing things down. (For the interested, this more technical explanation of Unicorn isn't exactly the way things are set up on the AO3, but will give you an idea.)
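In configuration terms, the size of the herd is just a setting in Unicorn's config file. The fragment below is a hypothetical sketch (the numbers and socket path are invented, and this is not the AO3's actual configuration), but it shows the knobs involved:

```ruby
# Hypothetical unicorn.rb sketch - not the AO3's actual settings.

# Each worker is one "unicorn in the hall": more workers can serve more
# requests at once, but each one takes up RAM.
worker_processes 8

# Requests queue on this socket until a worker is free; if the queue
# backs up too far, Nginx returns 502 errors to users.
listen "/tmp/unicorn.sock", backlog: 64

# Kill workers that take too long, so one stuck request can't hog a slot
timeout 30
```

Doubling the RAM makes room for a higher `worker_processes` count without the machine starting to swap.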

New drives

We're also installing some new drives in our two oldest machines. Both these machines have room for six drives; currently they each have four installed. Information is mirrored on the drives so that if one goes down, the system continues to work. At the moment, one machine has a broken drive. We'll be replacing the broken drive, and at the same time adding two spares to both machines so that we have more backups if anything else breaks.

Our two original machines preparing to nom their new drives
Front end server: cartoon style image of server
Slave: cartoon style image of server


My, how we've grown! A few AO3 stats

Published: 2012-07-16 12:09:58 -0400

We've been talking a lot recently about how much the AO3 has expanded over the last few months. One easy statistic for us to lay our hands on is the number of registered accounts, but this only represents a tiny portion of site activity. Our awesome sys-admin James_ has been doing some number crunching with our server logs to establish just how much we've grown, and provided us with the following stats (numbers for June not yet available). Thanks to hele for making them into pretty graphs!

Visitors to the AO3

Line graph showing the number of visitors to the AO3 per month, December 2010 to May 2012. The line progresses steadily upwards with a significant spike from 1,197,637 in April 2012 to 1,409,265 in May 2012.

The number of unique visitors to the site has increased almost every month since December 2010 (each unique IP address is counted as one visitor). There are a few points where the rate of increase gets more dramatic: there was a jump of 244,587 across December 2011 and January 2012, compared to one of 137,917 over the two months before that. This can probably be accounted for by the fact that during December and January, holiday challenges such as Yuletide bring more people to the site. This theory is borne out by the fact there was a slight dip in the number of visitors during February 2012, indicating that some of the extra traffic in the previous two months came from 'drive by' visitors who didn't stick around.

May 2012 saw a steep increase in the number of visitors: there were 211,628 more visitors to the site than there had been the month before! The rapid increase in visitors was not without its price: this was the month of many 502 errors!

Traffic to the AO3

Line graph showing AO3 traffic in GB per month, December 2010 to May 2012. The line progresses steadily upwards with a significant spike from 2192 GB in April 2012 to 2758 GB in May 2012.

The increase in the number of visitors to the site has also been accompanied by an increase in overall site traffic (how much data we're serving up). Again, there's a significant spike during December/January. Interestingly, there's no dip in traffic for February 2012, showing that even though there were some 'one time' visitors over the holiday period, there were also plenty of people who stayed and continued to enjoy fanworks on the site.

The increase in traffic to the site clearly accelerated in 2012. Between January and May 2011 traffic increased by just 159.92 GB; the same period in 2012 saw an increase of 1,870.26 GB! In fact, with an increase of 566 GB during May 2012, that month alone saw almost as big a jump in traffic as the whole of the previous year (595.63 GB)!
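For anyone who likes to check the arithmetic, the month-on-month jumps quoted above fall straight out of the figures shown in the two graphs:

```ruby
# Checking the quoted jumps against the numbers from the graphs above

visitors_apr_2012 = 1_197_637
visitors_may_2012 = 1_409_265
puts visitors_may_2012 - visitors_apr_2012  # => 211628 more visitors in May

traffic_apr_2012 = 2192 # GB
traffic_may_2012 = 2758 # GB
puts traffic_may_2012 - traffic_apr_2012    # => 566 GB jump in May alone
```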

And the other stuff

With these kinds of numbers, it's not surprising that there've been a few bumps along the way. For information on how we're dealing with the growth in the site you can check out our posts on performance and growth and accounts and invitations.

Many thanks to our dedicated volunteers for their hard work dealing with the growth of the site, and to our fabulous users for their patience with our growing pains - and for creating the awesome fanworks so many people are flocking here to see!

