To be or not to Bigpond?

Bigpond is the biggest ISP in Australia.

Back in the 1990s and up to 2001, my ISP was a small company named Healey. They provided good service at very affordable prices.

Unfortunately, they were first bought by another ISP, which then went out of business, and I was ‘left on the street’, needing to choose a new ISP quickly.

The reason I chose Bigpond at the time was that although I like to give the ‘small guy’ a go, I felt I had already done so and lost because of it. I wanted an email address that would be permanent, one that would last for many years to come. Bigpond, though dearer, was affordable, and being ‘big’ ensured that I would not lose my email address to bankruptcy or the like. They were also offering very good deals if I bundled my telephone, internet and mobile with them.
Webmail services such as Hotmail, Gmail, Yahoo Mail, etc. either did not exist or were in their infancy then.

Four years ago, when it was time to get a mobile phone for Eitan, belonging to Bigpond’s parent company, Telstra, was no longer as beneficial as what its competitors offered. 3, for example, offered a very low rate that included free calls between Eitan’s phone and mine, and that is when the weaning from Telstra/Bigpond began.

I have been reluctant to let go of Bigpond. The main reason was actually the very reason I chose them in the first place: a reliable email address that would not change. My fear of giving up an email address held for such a long time made me put off the decision. I know from my own experience that no matter how often somebody reminds you to update your address book with their new email address, it does not always get done.
Even if you do update your address book, Microsoft Outlook, for example, will still suggest the old address when you start typing somebody’s name in the To field.

Bigpond were extremely uncooperative about letting me switch to another ISP AND keep my old address, even temporarily. Although my account started as a dial-up account (remember what that was?), and their dial-up rates are in the order of AU$10 a month, my offer to keep paying that amount just for the email service was refused. They insisted that the only way to keep my email address was to pay in the order of AU$30 a month or more, which I did not find acceptable.

Enter iiNet’s aggressive advertising campaign, with the (for me) new concept of naked DSL.
I finally bit the bullet and took a slow and calculated approach to the change of address:

  1. I announced the change, and asked everybody in my address book to amend their entries and use my gmail.com address (email me for the exact details if you have not changed it yet!).
    The reason for choosing Gmail was that it is not ISP-dependent (I have learned my lesson!).
    I am following what I do in real life too: use a post office box as my mailing address instead of my residential address.
  2. For over a month I monitored the old dabas@bigpond.com address and sent a renewed reminder to those who were still writing to it. I ignored all of the messages that I considered spammy – good riddance!
  3. Most importantly, I made sure to update all of my internet banking and related details to my Gmail address. It is surprising how many passwords I had forgotten, and how useful it was to still have the old address available while making the change.
  4. I repeated this process with any other sites I have accounts with, such as eBay, Messenger, Facebook, PayPal, etc. I really hope I have not forgotten any, as it will be too late now!

Once Eitan had finished his Year 10 exams and no longer needed the internet for study purposes, I initiated the switch to iiNet.

I was warned that the switch could take up to 21 days, and that there was a chance that during that period I would be without internet and without telephone. (GULP!)

In reality, the wait was much shorter: about 5 days altogether. Not bad, iiNet!

We did have a few teething problems, which were handled very professionally by their staff at a call centre in South Africa. It was fun listening to their strong accents coming all the way from Cape Town. What a difference from Telstra/Bigpond, where I find it so annoying to be greeted by voice recognition software that somehow has difficulty with my voice or my English, and which has me screaming ‘I WANT TO TALK TO A HUMAN BEING’ into the phone, only to be advised that ‘I was not able to understand your response, please try again’, or something similar.

To make a long story short, here is a summary of why I am better off:

| | Before (Bigpond) | After (iiNet) |
| --- | --- | --- |
| Monthly allowance | 25GB | 30GB + 30GB off-peak (soon to change to 50GB + 50GB, albeit with a redefinition of off-peak) |
| Monthly telephone rental | $30 | $0 – switched to naked DSL, and calls go via VoIP |
| Monthly broadband fixed charge | $79 | $79 |
| Local calls | Charged at $0.20 | Free |
| National calls | Charged | Free |
| Home messaging | Charged | Free – voice messages are actually emailed as .wav files, received immediately by my iPhone! An unexpected great extra! |
| Online tools (to check current usage) | Delayed information; you probably have to wait a few hours to see your exact usage | A much better interface that shows almost immediately where you stand on both peak and off-peak use |
| Level of broadband service | Can’t complain – required very infrequent router reboots | Not so good – I have had to reboot the router occasionally |
| Help desk | Voice recognition; it takes quite a long time to reach a human being, and the person you are talking to will often put you back on hold, with a long wait, when a specialist is needed | An immediate human being; you call, leave your name and number, and they call you back when it is your turn! |
| ADSL | ADSL1 | ADSL2 (FASTER!) |

In short, for the time being, I am very happy with my decision to switch!

Posted in Uncategorized | 4 Comments

Discovering LINQ to DirectoryInfo-FileInfo: But LINQ is not SQL

LINQ looks like fun!

Being able to use a SQL-like language over all sorts of different objects sounds really quite exciting, and I was just hoping an opportunity would arise for me to make use of it.

That opportunity came with the need to compare incoming files from a business partner at work.

Essentially, we receive files from this particular partner in two folders. Let’s name them folder A and folder B. The files in folder A hold meaningful information for us, but the files in folder B are simply useless. I wanted to create weekly statistics so I could prove to our contacts at the partner company that over 90% of the files received were of the second type, and maybe they could either merge them into one daily file or stop sending them altogether.

A perfect scenario for the use of LINQ, right?

Assuming the file name and creation date for one of the folders were stored in a SQL table, what I was after would be something like:

SELECT COUNT(*) FROM FolderA where CreationDate > DateAdd(Week, -1, GetDate())

And maybe go one step further and add grouping, to find out how many files arrived on each day of the week (how many on Monday, on Tuesday, etc.):

SELECT COUNT(*), DATEPART(dw, CreationDate) as DayOfWeek 
FROM FolderA where CreationDate >= DateAdd(Week, -1, GetDate())
GROUP BY DATEPART(dw, CreationDate)

Furthermore, SQL lets me use the TOP keyword, in case I want the statistics to encompass more than the last week. Maybe I can base my statistics on the last 10% of the data, or so.
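
As a sketch of that idea (still assuming the hypothetical FolderA table from above), T-SQL’s TOP even accepts a percentage, as long as an ORDER BY defines which rows come first:

SELECT TOP 10 PERCENT *
FROM FolderA
ORDER BY CreationDate DESC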

Now… I do not have this SQL table, so… will LINQ be able to give me the same functionality?

 

First attempt:

   1: private static void GetStatistics(DateTime FromDate)
   2: {
   3:     string Location = ConfigurationManager.AppSettings["InboxLocation"];
   4:     DirectoryInfo diInbox = new DirectoryInfo(Location);
   5:     FileInfo[] InboxFiles = diInbox.GetFiles();
   6:     var files = from file in InboxFiles
   7:                 where file.CreationTime >= FromDate 
   8:                 select file;
   9:     int count = files.Count();
  10:     Console.WriteLine("Inbox Count: {0}", count);
  11:     Console.ReadLine();
  12: } 

It works!!!

Try it on your own system; it seems to be quite OK. But in my work situation, where there are thousands of files in the folder… it is SLOW. (Take into account that folder B receives over 1000 files a day, so the number of files is gigantic… and growing each week.)

Now, in SQL we can create an index, hence we expect results to be returned within a reasonable number of seconds. But the file system, as far as I know, is not built that way.
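
If the hypothetical FolderA table existed, a single statement would provide that index (the index name is made up for illustration):

CREATE INDEX IX_FolderA_CreationDate ON FolderA (CreationDate)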

And although I specifically have a where clause on line 7, single-stepping through the code reveals two facts:

  1. The LINQ statement actually only gets executed when line 9 is hit.
  2. Even though there is a where clause on line 7, ALL files get scanned by the LINQ statement; the where clause only determines which results the query yields, as the sketch below illustrates.
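
Here is a minimal sketch of that behaviour (the folder path is hypothetical, and I am using method syntax rather than the query syntax above, but the two are equivalent):

using System;
using System.IO;
using System.Linq;

class DeferredDemo
{
    static void Main()
    {
        // GetFiles() itself returns the complete array up front.
        FileInfo[] inboxFiles = new DirectoryInfo(@"C:\Inbox").GetFiles();
        DateTime fromDate = DateTime.Now.AddDays(-7);

        // Building the query does no work at all yet (deferred execution).
        var query = inboxFiles.Where(f => f.CreationTime >= fromDate);

        // Only here, when the query is enumerated, does LINQ walk the
        // WHOLE array, testing every FileInfo against the predicate.
        int count = query.Count();
        Console.WriteLine(count);
    }
}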

Below is the LINQ equivalent of the grouping SQL statement I posted earlier. (Again, at work it is very slow, defeating its purpose.)

private static void GetStatistics(DateTime FromDate)
{
    DateTime dt = DateTime.Now;
    string Location = ConfigurationManager.AppSettings["InboxLocation"];

    DirectoryInfo diInbox = new DirectoryInfo(Location);
    FileInfo[] InboxFiles = diInbox.GetFiles();
    var files = from file in InboxFiles
                where file.CreationTime.CompareTo(FromDate) >= 0
                group file by file.CreationTime.Date into g
                select new
                {
                    Count = g.Count(),
                    Date = g.Key
                };

    foreach (var g in files)
    {
        Console.WriteLine("{0} {1}", g.Date, g.Count);
    }

    Console.WriteLine(DateTime.Now - dt);
    Console.ReadLine();
}

 

Maybe the solution is to create two LINQ statements, with the first one reducing the size of the sample data? How does one translate the T-SQL TOP keyword?

I placed a question on Experts Exchange asking for help, and learned about Take() (click here to see the question).

This was my attempt using Take():

private static void GetStatistics(DateTime FromDate, int Top)
{
    try
    {
        string Location = ConfigurationManager.AppSettings["InboxLocation"];

        DirectoryInfo diInbox = new DirectoryInfo(Location);
        FileInfo[] InboxFiles = diInbox.GetFiles();

        //First we get the files and select only their dates, ordering them descending
        var files = from file in InboxFiles
                    orderby file.CreationTime.Date descending
                    select new
                    {
                        Date = file.CreationTime.Date
                    };

        //Attempt using TakeWhile.. did not work
        //var files1 = files.TakeWhile((file, Date) => (file.Date >= FromDate && file.Date <= ToDate));

        //We take a subset by using Take. Top is the number of files we want to restrict our search to
        var files1 = files.Take(Top);

        //We group the results just found
        var files2 = from file in files1
                     where file.Date >= FromDate
                     group file by file.Date into g
                     select new
                     {
                         Count = g.Count(),
                         Date = g.Key
                     };

        foreach (var g in files2)
        {
            Console.WriteLine("{0} {1}", g.Date, g.Count);
        }
        Console.ReadLine();
    }
    catch (Exception ex)
    {
        //The original listing had a try with no catch, which does not compile
        Console.WriteLine(ex.Message);
    }
}

At first I thought that this really improved the time considerably, but after further testing it was disappointing to concede that, no matter what I do, LINQ will traverse the whole gigantic folder.

The answer is simple: I have to create an index somehow. I have been told that in LINQ you can add your own providers, and FileInfo indexing could probably be handled there. I have not yet reached such an advanced LINQ stage, so it will probably be easier to just load the whole folder into a SQL table, set up an index there, and let SQL do what it knows best.
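
For what it is worth, here is a minimal sketch of that last idea (the connection string, table and index names are all hypothetical, and the dbo.FolderA table is assumed to already exist with an index on CreationDate):

using System;
using System.Configuration;
using System.Data;
using System.Data.SqlClient;
using System.IO;

class LoadFolderIntoSql
{
    static void Main()
    {
        string location = ConfigurationManager.AppSettings["InboxLocation"];

        // Copy the file metadata into an in-memory DataTable first.
        DataTable table = new DataTable("FolderA");
        table.Columns.Add("FileName", typeof(string));
        table.Columns.Add("CreationDate", typeof(DateTime));

        foreach (FileInfo file in new DirectoryInfo(location).GetFiles())
            table.Rows.Add(file.Name, file.CreationTime);

        // Bulk-insert into dbo.FolderA, which should already carry an
        // index such as IX_FolderA_CreationDate for the date queries.
        using (SqlBulkCopy bulk = new SqlBulkCopy(
            "Data Source=.;Initial Catalog=FileStats;Integrated Security=True"))
        {
            bulk.DestinationTableName = "dbo.FolderA";
            bulk.WriteToServer(table);
        }
    }
}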

I still think LINQ looks like fun!

Posted in .NET programming | Leave a comment

Microsoft Certification: A true measure of knowledge?

I have changed my mind.
Based on my previous experience with the certification tests, either my particular circumstances have changed, or Microsoft has considerably raised its standards.

A little background: in September 2004, I enrolled in the Masters of Systems Development program offered by ITMasters and Charles Sturt University. I found this program very attractive because it would not only upgrade my knowledge to Masters level, but also supply me with industry certification in the form of the then sought-after MCSD (Microsoft Certified Solution Developer).

It was after doing very well in one of the required exams, Developing and Implementing Web Applications with Microsoft Visual Basic .NET and Microsoft Visual Studio .NET, that my confidence in the certifications as a worthwhile measure of knowledge was lowered. Not because I did badly; the contrary is true, as can be seen from my results:

With 700 being a pass, a score of 886 was something to be proud of. But the fact is that at that stage I had nearly NO web development experience, as my professional work then consisted exclusively of Windows applications.
Later on, when I did move over to deploying web applications, my lack of knowledge became more apparent, as I spent many hours just trying to figure out basic difficulties that were not covered by the exam, extensive as it was!

A year after I started my studies, Microsoft released Visual Studio 2005, and the perception was that our curriculum was becoming outdated before being anywhere close to completion. ITMasters, who are in charge of the industry-subject side of the Masters program, did their best to add value to the degree by including 2005-specific subjects among the electives, such as MCTS SQL 2005 and UPGRADE: MCAD Skills to MCPD Windows Developer by Using the Microsoft .NET Framework.

At the time I was very involved with databases, and the SQL 2005 elective helped me get up to date on that side of things. Although I had been warned that the Upgrade to MCPD exam was difficult, I decided to go for it towards the end of 2007. By then I had completed all of my other degree requirements, and after two years of intensive work with VS 2005, I thought it would be easy to ‘just pass one more silly exam’.

Was I wrong! This particular exam is in fact extremely difficult! Because it is an upgrade from the VS2003-related certification to the VS2005 certification, it is a shortcut in which material from three exams is tested together.
My first shot, in December 2007, ended in disaster. It was obvious that my hitherto proven formula of studying through the preparation exams provided by Self Test Software was not enough. The questions in the real exam covered material I had never seen before, and although I did not appreciate it then, I agree now that I did not deserve to pass.

As a result of the failure, I bought the Microsoft Press book for the exam area in which I fared worst and tried again, once in February and then again in March – this last attempt just to finish in time to qualify for graduation in April 2008. Although I improved, these two attempts also ended in failure.

In desperation, I spent nearly every early morning and most of my weekends trying to fill in all of the holes in my knowledge that prevented me from completing my degree. This is no easy task at all, as each of the three books covering the basic material comprises over a thousand pages! And this material only gives you a background, which has to be researched further on MSDN and, most importantly, practised and incorporated into your own programs.

The task is made more difficult by Microsoft’s failure to provide enough feedback about the areas where you are weak. Looking at the image above, we can see that in that earlier exam there were 7 areas tested, and it is quite clear that two of them required improvement, with possibly only 50% of the questions answered correctly. The feedback I was getting now had only three areas, one for each book. I believe Microsoft should change their score presentation and provide more information, subdividing each of the areas and thereby helping the failing candidate identify exactly what he or she needs to improve.

Last week, I finally passed! Here is my result:

What is really surprising is the score: just 700? I barely passed? Can it be that with just one more question wrong I would have failed?

While completing the exam I kept count of the questions I suspected I had got wrong. After all, when given four multiple-choice answers, you can divide the questions into three categories (a worked example follows the list):

  1. The questions where you know the right answer AND can explain why the other three are wrong. Chances are that either you have it right, or you have fallen into a trap. From my experience with practice exams, 90% of the questions in this group turn out to be correct.
  2. The questions where you know two alternatives are definitely wrong, but you have difficulty choosing between the other two. The chance of answering these correctly is 50%.
  3. The questions where you have no idea at all what they are talking about. The chance of answering these correctly is 25%.

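To illustrate with purely hypothetical numbers: if an exam had 50 questions, of which 40 were of type 1, 6 of type 2 and 4 of type 3, the expected number of correct answers would be 0.9 × 40 + 0.5 × 6 + 0.25 × 4 = 36 + 3 + 1 = 40, i.e. 80%.
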
Hence, I had quite a good idea of how I was doing. Assuming that all of my type 2 and type 3 questions were wrong, I reckoned I had answered over 80% of the questions correctly. The black lines in the image above also all fill more than 80% of the area. I wonder what sinister method Microsoft uses to score these exams? I shudder to think how badly I would feel had the result been 699, or whatever the highest non-passing score is. It is scary!

In my opinion, either the scoring of this particular test is faulty, or Microsoft should do a better job of reporting the areas that need further study.

In short: if you want to hire somebody based on their Microsoft certification, check whether they have passed this exam! Its value is definitely higher than that of the previous MCSD incarnations!!!

So, dear reader, do you agree? What is your opinion of Micro$oft’s certifications? Have you had similar experiences? Please leave a comment!

My current collection of Microsoft Certificates can be viewed here:

https://mcp.microsoft.com/authenticate/validatemcp.aspx
Enter Transcript ID 758566 and Access Code daniel12

Posted in Microsoft Certifications | 5 Comments

My first blog entry: Jumping into the LINQ to XML swimming pool!

 

This is a day of firsts.

Not only is it my first technical blog entry, but it is also my first .NET 3.5 program, and the first time I have used LINQ.

I installed Visual Studio 2008 immediately after the Heroes Happen {Here} event in Sydney, but have mostly used it to edit and add to existing .NET 2.0 applications.

LINQ has intrigued me for a while, especially LINQ to XML, but all the demos I have seen so far have covered LINQ to SQL and LINQ to Objects.

Then yesterday, I received an email from Experts Exchange asking me to help out with a question on XML.

I have been a member of Experts Exchange since 2003, when I was struggling with the steep learning curve from VB6 to .NET.

And that learning curve was a hard one to master. Today you just Google a few keywords and will probably find at least two good examples answering any question you have. Not so in those days: the books on .NET were just thick hard copies of the help system, with hardly any good examples to choose from.

My first Experts Exchange question simply asked: how do I read data from a particular source, then update the same source, or another one? Today this would be considered a naive question, but in those days, moving from an ADO frame of mind to the disconnected DataSet concept was not easy.

Getting back to the point:

In this particular question, the asker is in a similar situation: he is trying to use LINQ for the simple task of adding an XmlNode to an existing document. He knows a little LINQ using VB.NET, but is stuck and does not understand the exception his code is throwing.

Sounds so simple and easy!

But after two days, and two calls for help to more experienced experts who had received an email similar to the one I got, the question was still unanswered.

"Well", I thought to myself, "it is time to get myself acquainted with the new technology". Best way to learn to swim is to jump into the pool.

First Step: Google for "LINQ to XML"

Two minutes later I land on "LINQ to XML – 5 Minute Overview". Sounds quite promising! OK, so now we have XElements and XAttributes instead of XmlElements and XmlAttributes. It must be easy, then, to solve the problem just as with the ol’ XmlDocument type of objects, but using instances of these classes instead.

First try:

Use part of the asker’s code:

Dim track As XElement =
                      <track id="45"> <trackId>45</trackId> </track>

Dim doc As XmlDocument = New XmlDocument() 'Now let's see how I add an XElement to it.

Intellisense, help me out!!!  No luck…. I am still using the older classes!

Ahh… It is XDocument I have to use… makes sense!

Second try:

Dim doc As XDocument = XDocument.Load(sPath) 'sPath points to the document, which starts with <tracks><track>… (note: Load is a Shared method; there is no instance doc.Load)

doc.Add(track) 'Geee… this is sooo easy!!! Let's test it and claim victory

Oh Nooo!!!… InvalidOperationException was unhandled. This operation would create an incorrectly structured document. 😦

OK… that makes sense. The document element is <tracks>…</tracks>, and I am adding <track> at the end of the document; the document would then no longer have a single root element, and hence it is not structured properly.
What I need to do is add a child to <tracks>… should be easy… but… there is no .AddChild or .Children in Intellisense!!! Hmmm…

Intellisense-aided attempts:

doc.AddAfterSelf(track)
doc.AddFirst(track)
doc.FirstNode.AddAfterSelf(track)
doc.LastNode.AddBeforeSelf(track)

All fail. In short, they all try to do the same thing as before, as the document’s first (and last) node is still <tracks>. GRRRR. How do I get to the inner <track> nodes and then use AddAfterSelf or AddLast?

Time to Google for an answer: "XElement first child Dim" (adding Dim guarantees VB.NET examples; use C# in the search otherwise).

LINQ to XML Samples – DML gave me the clue I needed, and the final code looks like this:

Dim sPath = "tracks.xml"
Dim doc As XDocument

If File.Exists(sPath) Then
    doc = XDocument.Load(sPath)
    Dim track As XElement =
                            <track id="45">
                                <trackId>45</trackId>
                            </track>

    'Dim Tracks = doc.<tracks> ***** First attempt. Still does not work – same problem as above ******
    Dim Tracks = doc.<tracks>.<track> 'It is so easy once you get the hang of it! It is still confusing, though; compare this with the XPath statement //tracks/track, which returns all <track> nodes
    Tracks.Last().AddAfterSelf(track)
    doc.Save(sPath)
End If
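
For comparison, here is a minimal C# sketch of the same insertion (assuming the same tracks.xml); XDocument.Root is another way to reach the <tracks> element, and Add appends a child to it:

using System.IO;
using System.Xml.Linq;

class AddTrack
{
    static void Main()
    {
        string sPath = "tracks.xml";
        if (File.Exists(sPath))
        {
            XDocument doc = XDocument.Load(sPath);
            XElement track = new XElement("track",
                new XAttribute("id", 45),
                new XElement("trackId", 45));

            // Root is the <tracks> element; Add appends <track> as its last child.
            doc.Root.Add(track);
            doc.Save(sPath);
        }
    }
}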

The complete question and solution at Experts Exchange can be seen here.

Posted in .NET programming | 3 Comments