
A Real-World SQL Azure Implementation

An inside look at Quosal’s migration to Microsoft’s cloud computing platform

Cloud computing is a relatively new concept in the SQL Server community, so it shouldn't come as a surprise that there are still a lot of questions surrounding the migration process, data security, and cloud application performance. You can read all you want about how a cloud provider's environment should work, but if you're like most DBAs, you need to hear how it's working for an actual cloud customer before you consider it for your own environment. Therefore, SQL Server Magazine is providing you with an inside look at how migrating to the cloud—Microsoft SQL Azure, in particular—went for Quosal, a quote and proposal software provider. We spoke with Stephen Yu, the vice president of development at Quosal, about how the migration process went, how migrating data to the cloud directly affected employees, and the advantages and disadvantages the company has found in working with SQL Azure.

SQL Server Magazine: Can you give our readers a brief overview of Quosal and your overall experience with the Windows Azure platform and SQL Azure?

Stephen Yu: We were founded in 2005 basically to provide a better quoting solution. About 40 percent of the business that's done in this country is done through quotes and proposals of some kind, the rest being retail or specialty types of business. So we were looking to build a quoting tool that lets people prepare quotes and proposals easily. My role at the company has been to do the technology research and figure out what [cloud computing] platform we want to adopt, so that's kind of how we landed on Windows Presentation Foundation in Microsoft .NET, and then SQL Azure as our back-end cloud offering.

We actually were one of the first testers for Azure, so we were able to migrate existing customer data, and [deploy] new customers to SQL Azure, pretty much as soon as SQL Azure was released, which was the beginning of last year. We’ve found it to be a very scalable tool. The main concern for us was whether we were going to do our own hosting around the world or leverage the infrastructure Microsoft had already put in place for their cloud hosting solution. We found that the SQL Azure infrastructure was very effective for us, especially getting into markets in Europe, South Africa, and Australia.

SQL Server Magazine: Can you discuss the migration process and the specific data you migrated to SQL Azure?

Yu: We originally architected Quosal on just a SQL Server back end and SQL Server Express. I think the original version was on SQL Server 2005, and then we upgraded to SQL Server 2008. The actual data comprises a couple of different components. The majority of the data in our business objects is CRM data from the various CRM packages we integrate with, such as Microsoft Dynamics and Salesforce: basically customer name, contact information, and the opportunity information on that customer, such as how hot of a prospect they are and whether or not they'll close.

The other key piece of information is product data. We have customers across quite a few verticals, but as a general rule most of the product data comes to us via real-time XML feeds from various distributors. An example of that would be Tech Data or Ingram Micro in the IT space. They sell parts, say, to Staples or one of our other customers, so we pull in that information from these distributors and populate our database with it. The kind of information we're talking about is product descriptions; product brochures in PDF or image formats such as JPEG; pictures in large, small, and thumbnail sizes; and quite a few of the manufacturer marketing descriptions, which reside in notes-type text fields. Each quote varies in size, but you're probably looking at somewhere between 500KB and 1MB on the high end for a really big quote, and probably 20KB to 100KB of data on the low end.

As far as migration goes, the main reason we were able to migrate to SQL Azure so quickly and effectively is that we built our own mapping tool. What happens is that when the Quosal product is updated, it updates its own schema. One of the big dilemmas we had was, can we really go to this platform if we have to have our customers manage a remote SQL Server database? And the answer was no. So, because we had this automatic schema updater, we were able to say, "Well, in this case we're going to use the tool itself to manage this database update." So really the only time we interact with the Azure database directly, from a web portal or from SQL Server Management Studio, is during the initial installation. After that, Quosal maintains its own schema updates, checks for the indexes that are needed as it's running, and recreates them on the fly. Because of that flexibility in our product, the transition to SQL Azure and the actual conversion process were easy to handle, because to our application SQL Azure looks almost exactly like a SQL Server 2008 database. It probably took about a day or two of coding and/or slight modifications to our connection strings, things like that.
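To give a sense of how small that connection-string change can be, here is a minimal sketch of an on-premises connection string next to its SQL Azure equivalent. The server, database, and login names are hypothetical, not Quosal's actual configuration; SQL Azure expects a fully qualified server name, a login in user@server form, and typically an encrypted connection.

using System.Data.SqlClient;

class ConnectionExample
{
    // On-premises SQL Server connection string (hypothetical names).
    const string OnPremises =
        "Server=SALESDB01;Database=Quotes;Integrated Security=True;";

    // Equivalent SQL Azure connection string: fully qualified server name,
    // SQL authentication in user@server form, and SSL encryption enabled.
    const string SqlAzure =
        "Server=tcp:myserver.database.windows.net;Database=Quotes;" +
        "User ID=appuser@myserver;Password=...;Encrypt=True;";

    static void Main()
    {
        using (var conn = new SqlConnection(SqlAzure))
        {
            conn.Open();  // application code above this layer is unchanged
        }
    }
}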


SQL Server Magazine: It sounds like it’s almost seamless. What advantages and disadvantages have you found with going to the cloud?

Yu: Certainly the main advantage we've seen going to the cloud is that we have a lot of clients whose sales teams are distributed; they're not all in one location. The advantage of going to the cloud is that they're able to access their data on the road from wherever they are. In our original deployment model, we had to build a self-hosted IIS server that used web services to provide an additional data layer so that remote sales teams could connect to that host server. That's actually a fairly big implementation in terms of both time and infrastructure costs: putting up a web server, running IIS, and handling all the hardware, backups, and things like that. So, from that perspective, it's been a huge advantage to cut that entire layer out, because with SQL Azure we're able to connect directly from anywhere, as long as we have the connection info and as long as the firewall rules allow that specific IP address or IP address range to access it.

Obviously I want to highlight both advantages and disadvantages; it's not all been a complete home run. I think the main disadvantage we've seen is that, at least as of right now, backups for SQL Azure are very difficult to do. In fact, that's probably an understatement. They rely on a couple of different technologies that have been somewhat cobbled together. A backup of a local SQL Server database is fairly easy to do; you have multiple ways to do it, including detaching a local copy of the database file, copying it, and putting it onto your backup and disaster recovery system. Because the database is in the cloud and it's multi-tenanted, that's not an option, at least not out of SQL Server Management Studio. So, to do the backup, there are a couple of tools that semi-automate the process, but the basic idea is that you have to export the schema of the database to a script file (.sql), use that schema to create the target database, and then use the SqlBulkCopy class (System.Data.SqlClient) to copy the data over. That means you're really looking at having a local SQL Server [system] anyway in order to back it up, because you're backing up your Azure database to the local SQL Server [system]. Or, you can back up the remote Azure database to another remote Azure instance, and that works as well. But then the reliability is a little bit suspect; if, for some reason, all of the Azure databases went down, or the actual portal to get into the databases went down, you would be offline.
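For readers unfamiliar with that workflow, here is a minimal sketch of the SqlBulkCopy step Yu describes, copying a single table from the SQL Azure database into a local SQL Server database whose schema has already been created from the scripted .sql file. The connection strings and table name are hypothetical, and the tools Yu mentions automate and extend this basic idea.

using System.Data.SqlClient;

class AzureTableBackup
{
    // Copies the rows of one table from SQL Azure to a local SQL Server
    // database whose schema was already created from the exported script.
    static void CopyTable(string azureConnStr, string localConnStr, string tableName)
    {
        using (var source = new SqlConnection(azureConnStr))
        using (var target = new SqlConnection(localConnStr))
        {
            source.Open();
            target.Open();

            using (var cmd = new SqlCommand("SELECT * FROM " + tableName, source))
            using (SqlDataReader reader = cmd.ExecuteReader())
            using (var bulkCopy = new SqlBulkCopy(target))
            {
                bulkCopy.DestinationTableName = tableName;
                bulkCopy.BulkCopyTimeout = 0;      // no timeout for large tables
                bulkCopy.WriteToServer(reader);    // streams rows to the local copy
            }
        }
    }
}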

SQL Server Magazine: Do your customers actually create their own SQL Azure accounts?

Yu: Yes. We were originally looking at reselling SQL Azure to them and kind of managing their accounts for them, but we just found that the overhead wasn't worth the fee Microsoft offered for it, and it didn't give the customers full control of their database. We wanted them to feel like they own their own database, which they do, and that we don't have access to it unless they choose to grant us that access, so it's an important component that they set it up themselves. They grant us access to do the initial implementation, but then it's up to them to change the passwords, change the firewall filters so that we're allowed in or not allowed in as they please, and do their own backups. As I mentioned, the backups are kind of a mess, so we've provided them some very detailed instructions and some additional tools we've made to help them along in that backup process. I do want to add the caveat that I believe Microsoft is working on better backup tools in general; I just don't believe they're in general distribution yet.

SQL Server Magazine: Have you received any feedback from your customers regarding having their data hosted on SQL Azure?

Yu: Actually, I would say the best feedback is that they don't notice a difference. A lot of our customers were migrated from a local SQL Server instance that they ran themselves, so we would definitely hear feedback if there were something wrong. The fact that we don't is generally a very good indication. The times we do hear feedback, it has been generally very positive: they're happy, their distributed sales teams are able to access their data, and they don't need to be on a VPN or use that self-hosted IIS model we talked about earlier.

SQL Server Magazine: Do you have feedback on the availability of the data? Has SQL Azure gone down for a long period of time?

Yu: That's kind of a tough question. I don't know the exact Microsoft SLAs—I'm sure I can pick up the docs—but I'm pretty sure they're within their SLAs; however, there have been outages—major outages—that have happened. Let's say that SQL Azure's been out for a year; I remember distinctly three major outages, one of which lasted almost 24 hours. So I can't say with 100-percent certainty that it's been up all the time, but the availability has been generally pretty good, and at least the portal site has been pretty good about telling us when it's down. So we're able to say, "Oh, OK, the availability of these servers is not there right now," and we're able to communicate that to our customers. But, yeah, when it's down, they don't really have a way to reach their data at that moment, and the backups are typically not something they want to work against during that time. The fortunate thing is that while it's really important to get a quote or proposal out in a timely manner, it's usually not considered a work stoppage if your quoting tool isn't online at that moment. And we do have a local instance of Quosal running against SQL Server Express, so a lot of our customers, even if they are completely offline, can at least get quotes out. They may not be linked up to their old quote data, and they may not be able to do comparison pricing against some of their previous quotes, but they're still able to stay up because of that local SQL Server Express instance.

SQL Server Magazine: What’s the average database size in your installation?

Yu: That really depends on how many quotes the customer does. We average a little less than 1MB per quote if the customer uses a lot of graphically intensive quotes; some tend to be a lot more text-based, in which case we're looking more at about a 200KB quote. But we have customers of all sizes, from multibillion-dollar corporations to one-man shops, so I can't really say there is an average database size, but there is an average quote size, so it just depends on the number of quotes. I would say our typical installation is a five-user sales team, and typically they'll generate along the lines of maybe 1,000 quotes a year. So that's less than a gigabyte a year of usage [on average] for the median installation.

SQL Server Magazine: Did your company have any concerns before implementing the SQL Azure instances? Any security concerns? And how were those concerns addressed by Microsoft?

Yu: We did have quite a few security concerns—obviously, this is pretty sensitive data. And actually a lot of our vendor partners were concerned about their real-time pricing getting into the wrong hands, I guess would be the best way to describe it. So those were some of our major concerns. In addition, because the database is accessible from anywhere, we wanted to make sure there was a way to lock down access to the data. We internally encrypt all of the key pieces of information that are company-specific. In addition, most of the traffic to the Azure database [can be] over a secure SSL connection, although I'm guessing the majority of our customers don't have it set up that way, mostly due to speed considerations and the fact that we encrypt the data in the database itself. But that's certainly a concern.
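As an illustration only, the following sketch shows one common way to do field-level encryption in .NET before a sensitive value is written to the database. It uses the AES implementation in System.Security.Cryptography; it is not Quosal's actual encryption scheme, and key management is deliberately left out.

using System.IO;
using System.Security.Cryptography;
using System.Text;

static class FieldCrypto
{
    // Encrypts a single sensitive field value with AES before it is written
    // to the cloud database. Key and IV handling are out of scope here.
    static byte[] EncryptField(string plainText, byte[] key, byte[] iv)
    {
        using (var aes = Aes.Create())
        using (var encryptor = aes.CreateEncryptor(key, iv))
        using (var ms = new MemoryStream())
        {
            using (var cs = new CryptoStream(ms, encryptor, CryptoStreamMode.Write))
            using (var writer = new StreamWriter(cs, Encoding.UTF8))
            {
                writer.Write(plainText);   // ciphertext accumulates in ms
            }
            return ms.ToArray();           // store these bytes in the database column
        }
    }
}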

I would say the main thing Microsoft has done to alleviate a lot of our security concerns is give us control over the firewall rules, so we can decide that only a certain range of IP addresses is allowed to access the database. We haven't encountered any unauthorized use of any of these databases in our install base of probably about 600 companies/database instances, with 1 to 500 users per company.
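For context, SQL Azure firewall rules can be managed from the portal or, as sketched below, by calling the sp_set_firewall_rule stored procedure against the server's master database. The rule name and IP range here are hypothetical.

using System.Data.SqlClient;

class FirewallRuleExample
{
    // Adds (or updates) a server-level firewall rule so that only a known
    // office IP range can reach the SQL Azure server. This must be run
    // against the master database of the server.
    static void AllowOfficeRange(string masterConnStr)
    {
        using (var conn = new SqlConnection(masterConnStr))
        {
            conn.Open();
            using (var cmd = new SqlCommand(
                "EXEC sp_set_firewall_rule @name = N'OfficeRange', " +
                "@start_ip_address = '203.0.113.0', " +
                "@end_ip_address = '203.0.113.255';", conn))
            {
                cmd.ExecuteNonQuery();
            }
        }
    }
}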

SQL Server Magazine: Some of our readers are beginning to look into SQL Azure and other cloud offerings, but they’re still pretty hesitant about what it means for their jobs. How has migrating to SQL Azure affected Quosal’s employees?

Yu: I'm not sure I can say it directly affected Quosal employees that much; it has affected our customers a lot more than our employees. For employees, it's been more [about] relearning the technology and the processes, which are slightly different. We have a pretty rigid basic implementation package that we sell to new customers, and the project managers needed to be trained to deploy to the SQL Azure environment rather than to an internal customer database. So it was a pretty drastic retraining process. We can no longer expect our customers to have a SQL Server [system] already configured and running for us; we need to know how to do those sorts of things ourselves. So there was a level of training, but especially after a year now, our project managers have definitely gotten to the point where they're very comfortable doing these sorts of installs and understanding all the pieces that need to be checked off.

SQL Server Magazine: Were there any differences moving to Azure, as far as the back end? Was anything different from the development side of things for your applications, or was it pretty much the same?

Yu: We were somewhat lucky because, as I mentioned before, we abstracted away a big layer of the database connectivity side. Nothing in our application writes SQL directly; the mapping tool, which we built, writes all the SQL. So that helped a lot, because all we had to do was change the mapping tool slightly. There are actually a few SQL differences in SQL Azure, but they're very minor. I believe it's because of the multi-tenanted fashion; some of the indexing is slightly different. From the UI perspective, and from a development perspective on our UI layer, we had no differences. In our mapping layer, where we generate all the business objects, we made some small modifications, mostly to do with how indexes are created. They had to do with clustered indexes, from what I recall. Because of the multi-tenanted nature of the SQL Azure database, I think some of the clustered indexes didn't work and we had to make them nonclustered.
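To make the earlier point about on-the-fly index maintenance concrete, here is a minimal, hypothetical sketch of the check-and-create pattern a schema updater might use. The table, column, and index names are illustrative, not Quosal's actual schema.

using System.Data.SqlClient;

class IndexChecker
{
    // Checks whether an expected index exists and creates a nonclustered
    // version if it is missing. Assumes conn is an open connection to the
    // SQL Azure database.
    static void EnsureIndex(SqlConnection conn)
    {
        using (var check = new SqlCommand(
            "SELECT COUNT(*) FROM sys.indexes " +
            "WHERE name = 'IX_Quote_CustomerId' AND object_id = OBJECT_ID('dbo.Quote');",
            conn))
        {
            if ((int)check.ExecuteScalar() == 0)
            {
                using (var create = new SqlCommand(
                    "CREATE NONCLUSTERED INDEX IX_Quote_CustomerId " +
                    "ON dbo.Quote (CustomerId);", conn))
                {
                    create.ExecuteNonQuery();
                }
            }
        }
    }
}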

SQL Server Magazine: It sounds like it was a tremendous success. One of the key things that you mentioned is that your customers didn’t even really realize that there was a difference, and that’s pretty awesome.

Yu: Yes, I can’t reiterate that enough—they didn’t notice that there was a difference. The speed was actually fairly comparable, maybe a little bit slower for some queries, but nothing that was noticeable from the UI side, especially because a lot of the database layers were so abstracted. What happens is the UI interaction is done and then in batches we save the data in one transaction to the database, and the same thing [happens] when we’re pulling data from SQL Azure. So from that perspective customers really didn’t notice. The only time they noticed was when there was downtime, but that was minimal.
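As an illustration of the batching Yu describes, the sketch below wraps a set of pending statements in a single transaction so the application makes one round trip to the remote database rather than many small ones. The statements themselves are hypothetical.

using System.Data.SqlClient;

class BatchedSave
{
    // Saves a batch of pending changes in one transaction so the UI layer
    // issues a single commit against the remote database instead of many
    // chatty round trips over a higher-latency connection.
    static void SaveBatch(string connStr, string[] pendingStatements)
    {
        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();
            using (SqlTransaction tx = conn.BeginTransaction())
            {
                foreach (string sql in pendingStatements)
                {
                    using (var cmd = new SqlCommand(sql, conn, tx))
                    {
                        cmd.ExecuteNonQuery();
                    }
                }
                tx.Commit();   // one commit for the whole batch
            }
        }
    }
}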

I also can't say enough about the hosting capabilities. We're a small company of 13 people. We have customers around the world, mostly in the United States, but a good chunk in Australia, in the UK and Europe, and a significant number in South Africa. We were looking at hosting facilities in all those locations, adding servers and then co-locating with local Internet providers there, and the cost was, while not prohibitive, definitely an order of magnitude higher than what we ended up spending to do this development and then put our customers on SQL Azure. But I definitely would say that's been part of our success and our customers' success: being able to roll this out to other countries, pretty much without a hitch.

SQL Server Magazine: Thank you so much; all this real-world information will be very helpful for our readers who are considering migrating data to the cloud.
