Design for Performance: Implementing Web Services Better


Written on Sunday, December 30, 2007 by Edwin Sanchez

In a distributed environment where different servers contribute to the processing of a task, good design is essential. One point of consideration is your database. If you have a db4o database and you have designed it well, it will give you the expected results. However, something can still slow down your application even if the database is well designed. What else could go wrong? Your middle tier is another point of consideration. Are you using web services? Is your application waiting too long because of a large database query? Let’s consider some design guidelines that can take you to maximum warp.

  • Consider asynchronous calls.
    If you are doing large queries that can leave your calling application waiting for a long time, this is a must. I will explain this further in the next post.
  • Group different pieces of information into one web service call.
    You may have data that you need to display in one or more dropdown lists, an existing transaction that comprises several objects, and other information that you need to query from the database. Depending on your needs, you can reduce the number of calls by having a single web service return all of the objects you need (see the sketch after the caching example below).
  • Cache the results of the web service call.
    Output caching is one of the cool things in ASP.Net. How does it work for web services? Suppose you call a web service that returns true if a user is a valid user in your application. You can cache the result of the web service for a specified time, and when the same web service call with the same parameters is invoked again, ASP.Net returns the result from the cache instead of executing the method. This improves performance. Implementation is simple:

[WebMethod(CacheDuration=60)]
public bool IsUserAuthentic(string userId, string password)


You just need to add the CacheDuration web method attribute and give it a time period in seconds for the result to remain in the cache. In our example, the result will remain in the cache for 60 seconds. Take note, however, not to abuse this feature. It is really cool, I agree, but overusing it, especially caching large object sets for a long time, can be trouble. When caching results, ASP.Net stores them in server memory, where they remain for the specified cache duration. Service calls whose parameters vary most of the time are also not good candidates for caching.
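As a rough illustration of the second guideline, a single web method can return one composite object instead of forcing the page to make three separate calls. The sketch below uses hypothetical names throughout (PageLookupData, EmployeePageService, and the Load* helpers are illustrations, not code from an existing project):

using System.Web.Services;

public class PageLookupData
{
  public string[] Departments;     // values for the department dropdown
  public string[] Positions;       // values for the position dropdown
  public decimal PendingBalance;   // an existing transaction value the page displays
}

public class EmployeePageService : WebService
{
  [WebMethod]
  public PageLookupData GetPageLookupData(string userId)
  {
    PageLookupData data = new PageLookupData();
    data.Departments = LoadDepartments();              // assumed helpers that query the database
    data.Positions = LoadPositions();
    data.PendingBalance = LoadPendingBalance(userId);
    return data;                                       // one call, one network round trip
  }

  private string[] LoadDepartments() { return new string[] { "IT", "HR" }; }
  private string[] LoadPositions() { return new string[] { "Programmer", "Team Leader" }; }
  private decimal LoadPendingBalance(string userId) { return 0m; }
}

Returning everything the page needs in one round trip is usually cheaper than three smaller calls, each paying its own network and serialization overhead.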

The next time you find bottlenecks in your web services, assess the problem by considering the performance guidelines above.

Importing External Data to db4o


Written on Monday, December 24, 2007 by Edwin Sanchez

From time to time, we may encounter requirements that involve moving data from one source to another. Today, I will show you how external data can be imported into db4o.

The following technologies will be used:
· Db4o 6.4
· .Net 2.0 using C#


This post will discuss the following:
· Reading a text file using StreamReader
· Extracting data from a line of record into db4o objects
· Using the List Find method to check for existence


Data can come from many different sources. It can be an XML file, an Excel file, an RDBMS, a web service, or simply a text file. Today I’m going to show you how to read a text file as our source data. First, let’s take a look at the format of our sample text file:


"Department"|"Position"|"Name"|"Salary"
"Information Technology"|"Team Leader"|"Jun Marfil"|"30000.00"
"Information Technology"|"Programmer"|"Dino De Guzman"|"25000.00"
"Human Resources"|"Benefits Supervisor"|"Shyrel Morco"|"15000.00"

From the data above, we can see that we need a Department object, a Position object, and an Employee object. The column delimiter is the pipe symbol (|), and rows are separated by a carriage return and line feed.

In order to process this data, we need to read the file one line at a time. Each line should be split on the column delimiter, interpreted, and then written to a db4o database.

Now, in order to read the text file line by line, we will use a StreamReader. To parse each line, we will use the string Split method to put the data into an array of string columns.

Before we jump into code, let’s define our classes.


public class Department
{
    private string _name;

    public Department(string name)
    {
        _name = name;
    }

    public string Name
    {
        get
        {
            return _name;
        }
        set
        {
            _name = value;
        }
    }
}

public class Position
{
    private string _name;

    public Position(string name)
    {
        _name = name;
    }

    public string Name
    {
        get
        {
            return _name;
        }
        set
        {
            _name = value;
        }
    }
}

public class Employee
{
    private string _employeeName;
    private double _salary;
    private Department _department;
    private Position _position;

    public Employee(string name, double salary,
                    Department department,
                    Position position)
    {
        _employeeName = name;
        _salary = salary;
        _department = department;
        _position = position;
    }

    public string EmployeeName
    {
        get
        {
            return _employeeName;
        }
        set
        {
            _employeeName = value;
        }
    }

    public double Salary
    {
        get
        {
            return _salary;
        }
        set
        {
            _salary = value;
        }
    }

    public Department EmployeeDepartment
    {
        get
        {
            return _department;
        }
        set
        {
            _department = value;
        }
    }

    public Position EmployeePosition
    {
        get
        {
            return _position;
        }
        set
        {
            _position = value;
        }
    }
}

Now let’s look at the part of our program that parses the text. We assume that the database is already open at this point (a minimal sketch of opening the container and calling this method appears after the helper methods below).

private void ImportTextFile(IObjectContainer db)
{
    string employeeName;
    double salary;
    string employeePosition;
    string employeeDepartment;
    string[] record;
    string rawRecord;
    List<Position> positionList = new List<Position>();
    List<Department> departmentList = new List<Department>();
    Employee employee;

    try
    {
        using (StreamReader sr = File.OpenText("sample.txt"))
        {
            while ((rawRecord = sr.ReadLine()) != null)
            {
                // remove the quotes
                rawRecord = rawRecord.Replace("\"", "");

                // put into an array using the pipe delimiter
                record = rawRecord.Split('|');

                // skip the header line
                if (record[0] == "Department")
                {
                    continue;
                }

                // get column values
                employeeDepartment = record[0];
                employeePosition = record[1];
                employeeName = record[2];
                salary = Convert.ToDouble(record[3]);

                // add the position and department to the lists (and the database) if new
                positionList = AddToList(positionList, employeePosition, db);
                departmentList = AddToList(departmentList, employeeDepartment, db);

                // look up the objects we just ensured exist and store the employee
                employee = new Employee(employeeName, salary,
                    departmentList.Find(delegate(Department dept) { return dept.Name == employeeDepartment; }),
                    positionList.Find(delegate(Position pos) { return pos.Name == employeePosition; }));
                db.Set(employee);
            }
            db.Commit();
        }
    }
    catch (Exception ex)
    {
        db.Rollback();
        throw new Exception("Error processing text file", ex);
    }
    finally
    {
        db.Close();
    }
}


private List<Position> AddToList(List<Position> list, string name,
                                 IObjectContainer db)
{
    Position position;

    // search if the name being inserted is already in the list
    position = list.Find(delegate(Position pos)
    {
        return pos.Name == name;
    });

    if (position == null)
    {
        position = new Position(name);
        list.Add(position);
        db.Set(position);
        db.Commit();
    }
    return list;
}

private List<Department> AddToList(List<Department> list, string name,
                                   IObjectContainer db)
{
    Department department;

    // search if the name being inserted is already in the list
    department = list.Find(delegate(Department dept)
    {
        return dept.Name == name;
    });

    if (department == null)
    {
        department = new Department(name);
        list.Add(department);
        db.Set(department);
        db.Commit();
    }
    return list;
}
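
For completeness, here is a minimal sketch of how the import might be kicked off. The database file name is an assumption; the rest follows the db4o 6.4 API used above:

// hypothetical call site (using Db4objects.Db4o;)
IObjectContainer db = Db4oFactory.OpenFile("employees.yap");
ImportTextFile(db);   // parses sample.txt, stores the objects, then closes the container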



Now let’s recap. We made three classes for our objects, and they are pretty straightforward. The method we defined for processing the file uses several objects worth our attention. We used a StreamReader to process the file, and ReadLine is what reads it line by line. We did not try to load all the lines into memory, since that would become troublesome when the file is several MBs or GBs big. We used the string Split method to parse each line on the pipe delimiter and put the values into an array, and then we stored each array element in a variable. Another notable method is AddToList. We used the generic List<T> instead of IList so we can make use of the Find method, which we need to avoid inserting duplicate departments and positions. After parsing the values, we are ready to save an Employee object in our database.

Our code is rather simple at this point. One improvement would be to show the user some visual feedback, like a progress bar, to indicate the status of the processing. Another would be to replace the AddToList method with db4o callbacks that check for duplicates prior to commit.

That’s how simple it is to import data from an external data source like a text file into db4o. I hope you find something useful in this post.


Mars Closest to Earth on December 17, 2007


Written on Sunday, December 16, 2007 by Edwin Sanchez

Yes, it's true according to space.com, and Mars will get the most attention on December 23. Very cool for space enthusiasts. See this link for the full story. Too bad for me, it will only be visible to those living in the Northern Hemisphere. How I wish I were there! I'm really fascinated with space, the solar system, and the universe in general. That's why I also love Star Trek; it feels like the show is taking me there, where no one has gone before. So for those in the Americas, the Arctic, and parts of Europe, happy sky viewing!

“Message Received”


Written on Thursday, December 06, 2007 by Edwin Sanchez

db4o has a client-server feature called Messaging. It’s so cool: it lets you execute a process remotely, with little overhead, and the execution is asynchronous. I find it so useful that something like this fits right into my current project. I need to do something like this:

  • The user will upload a text file through a browser
    By the sound of it, this is an ASP.Net application. The text file is delimited, and it can be translated into columns of data.
  • After the file has been uploaded, a process should execute to transfer the data from the text file into my db4o database
    Each column of data in the text file will go into an object repository

Sounds simple. However, the text file is several MBs big, so it will be a long-running process, and the user will get very bored uploading and waiting for the data to be processed. So I came up with the following additional requirements:

  • Once the text file is uploaded, an asynchronous operation kicks off so the processing actually happens on the server where the db4o database is located.
  • The processing logs certain information so that when the user inquires about it, he can see the current status on a browser page.
  • The user can close the browser any time after the processing starts, or go to another page while the processing is running, and inquire later on a page that shows the current status. Wow! This is something that will make the user more productive.
  • On the server side, I would like to know when processing has been triggered and what its current status is, so I don't just shut down the db4o server or otherwise disturb it.
  • I can also do the uploading of the text file and the processing in the server application.

Now this is something more complicated. It made me excited, so I started the research. I came across the db4o reference documentation on Client Server, specifically the section on Out-of-Band Signaling. I read it and decided that this is the thing I need. How does it work? Figure 1 shows a high-level diagram of how I did it. I was used to doing distributed apps in my Visual Basic 6 days, so I tried something distributed in this case too. Explaining the architecture further is out of the scope of this topic.

[Figure 1: High-level diagram of the messaging architecture]
Now for some explaining. I will cover just the part where I’m actually sending and receiving the message, and I will keep it simple by explaining with code blocks. I can’t post my entire code because of restrictions with my current employer; for a working example, you can go to the reference section of the db4o documentation. First, the objective of messaging is straightforward: the client tells the server to execute a process. The message being sent is actually any object that is storable in db4o, so you create a C# class whose instances will be sent to the server.

Below is the class that we will use as our message class:

public class MyMessage
{
  private string _message;

  public MyMessage(string message)
  {
    _message = message;
  }

  public string Message
  {
    get
    {
      return _message;
    }
    set
    {
      _message = value;
    }
  }
}


In order to receive messages, you will need a class that implements IMessageRecipient. This can be in the same namespace where the server application resides.

public class MyMessageRecipient: IMessageRecipient
{
  void IMessageRecipient.ProcessMessage(IObjectContainer con, object message)
  {
   if (message is MyMessage) // MyMessage is the class we defined above
   {
     // call routines for your long processing and logging any status
   }
  }
}

Before a server can actually receive messages, it needs to be configured to do so. Something like the code below should run in the server application when the server is started.

IConfiguration config = Db4oFactory.NewConfiguration();
IObjectServer server;
config.ClientServer().SetMessageRecipient(new MyMessageRecipient());
server = Db4oFactory.OpenServer(config, [yapfile], [port]);
server.GrantAccess([userid], [password]);
// some other code here, if desired

Now, sending messages is simple enough. You can implement something like the code below in the Business Rules layer.

IObjectContainer db = Db4oFactory.OpenClient([hostname], [port], [userid], [password]);
IMessageSender sender = db.Ext().Configure().ClientServer().GetMessageSender();
MyMessage msg = new MyMessage("Some Message");
sender.Send(msg);

// some other code here, if desired

That’s how it is. You can try it yourself and see how this is a very good feature of db4o when dealing with client-server. Happy Messaging!

Microsoft DTS and db4o


Written on Monday, November 26, 2007 by Edwin Sanchez

Microsoft Data Transformation Services (DTS) is a good tool bundled with SQL Server. One of the things you can do with it is load data from different file formats, like text files and Excel files, into SQL Server. In this post, I will log my experience comparing it to the program I created to transfer data from text and Excel files into db4o.

I uploaded two files into SQL Server: one a text file and the other an Excel file. I uploaded them as is, based on the columns in the files, and the two files became tables in SQL Server. I then ran a query left-joining the two tables to find the items in one that do not exist in the other. The result was 1,033 records.

My db4o program's objective was to produce the same results as the SQL version. But to my surprise, the results were not the same. First, it gave me more than double the number of records I got from SQL Server. I found out that some values in the file have trailing spaces; for example, I expected "FOO" but the data was "FOO " (with one space character after it). So I trimmed them. After that, the counts were still not equal: the db4o program returned three more records. After some scrutiny, I found that I had forgotten an important element: the casing of the data is not consistent, some values being lowercase and others uppercase. (See related post here.) I converted them all to uppercase, and finally the results matched.

Here are the lessons learned:

1. DTS removes trailing spaces in your data. Your db4o program should do the same to obtain the same result.

So it only "looks" like you uploaded the data into SQL Server "as is"; in reality, all values are trimmed of trailing spaces. I proved that by inspecting the data in SQL Server with the len() function: the lengths do not match the source data in the text and Excel files.

2. Remember that db4o/C# is case-sensitive.

Maybe I had not fully internalized this, and I forgot it again.
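
To make both lessons concrete, here is a minimal sketch, with made-up names, of normalizing values before comparing them so the db4o program behaves the way DTS effectively does:

// hypothetical helper: normalize a value the way DTS effectively does before comparing
private static bool SameValue(string rawTextValue, string rawExcelValue)
{
  string left = rawTextValue.Trim();    // DTS strips trailing spaces, so trim ours too
  string right = rawExcelValue.Trim();

  // C#/db4o string comparison is case-sensitive, so compare case-insensitively
  // (or store everything uppercased, as described above)
  return string.Equals(left, right, StringComparison.OrdinalIgnoreCase);
}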

Hope this helps if you have this kind of stuff in your work.

Things to Watch Out in db4o


Written on Monday, November 26, 2007 by Edwin Sanchez

Since I started my project using db4o, I have encountered things I was not accustomed to doing. I used to work in Visual Basic 6 on MS SQL Server, then VB.Net 2003 on the same database product, and lastly C# (still on MS SQL Server). I would like to list a few things here that might help new entrants to the object-database world and C# who have a background similar to mine:

1. db4o/C# is case-sensitive
I'm used to SQL Server's case-insensitivity (although you can make it case-sensitive too by changing the settings; see SQL Server 2000 Books Online under the topic How to create a case-sensitive instance of SQL Server 2000 (Setup)). When comparing strings, take note of this or you're going to have lunch and dinner dates with the debugger. Those migrating from VB/SQL tools should expect this, since Java and C# are case-sensitive.

2. Get out of the relational mind-set
There are many posts on this so I won't go any further explaining.

3. Good performance depends on YOU
Some people may not agree with me, but db4o may not perform properly if your code is slow. db4o is very fast, as shown by the PolePosition benchmark. But if you jump straight into coding, process tons of objects, and skip the documentation on performance best practices, your code will crawl like a snail. I sometimes make my own mistakes by forgetting that opening connections and executing queries inside a while loop is bad news (a small sketch of moving the query out of the loop follows this list). This is not in the docs, but the db4o core team need not put it there, since it is a bad practice even in the relational world.

4. Missing Features?
Check the documentation or the community forum for the specific things you need. If something is not there, work around it for a while, or better yet, contribute to the community if you have great ideas. Things will get better soon; the core team is very agile, and this is open source, so features will improve through community contributions.
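
Regarding the third point, here is a minimal sketch of what I mean by keeping queries out of loops. The file name and the salary filter are made up, and the Employee class is assumed to have a Salary property:

// open the container once and run the query once, then iterate over the results,
// instead of opening a connection or re-running the query inside the loop
IObjectContainer db = Db4oFactory.OpenFile("data.yap");
try
{
    IList<Employee> wellPaid = db.Query<Employee>(delegate(Employee emp)
    {
        return emp.Salary > 20000;
    });

    foreach (Employee emp in wellPaid)
    {
        // process each employee here
    }
}
finally
{
    db.Close();
}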

That's it for now. I'll try adding more as I move along in my quest.

Good day to all


Just Started


Written on Monday, November 26, 2007 by Edwin Sanchez

I just started my own blog to establish presence. I will post on subjects regarding programming, development, db4o, the industry I'm in, my interests and other cool stuff that I find across the internet. I will post my schedule from time to time when needed.