
C#: Ints Vs Stringified-Ints


This article details the benchmarks and performance showdown in C# using Ints Vs Stringified-Ints.

 

C# programmers often find themselves using numbers in their string representation; that is, using “8” as opposed to 8.

 

In C#, though, strings store each character as 2 bytes (UTF-16), plus per-object overhead, whereas an int is always 4 bytes.

 

That means, counting character data alone (a real string also carries object-header and length overhead), for numbers:

  • with one digit, the string’s 2 bytes beat the int’s 4, so strings might use less memory and be quicker if arithmetic operations aren’t needed
  • with two digits, the footprints are both 4 bytes and speeds should be roughly equal
  • with three or more digits, ints should rule
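The per-character cost can be sketched in a few lines. This is a minimal illustration, not part of the article’s benchmark; `SizeDemo` is a made-up name, and it counts character data only, ignoring the string object’s header and length field.

```csharp
using System;

class SizeDemo
{
    static void Main()
    {
        // An int is always 4 bytes, regardless of its value.
        Console.WriteLine(sizeof(int));                    // 4

        // Each char in a .NET string is UTF-16, i.e. 2 bytes,
        // so "8" stores 2 bytes of character data while "10000" stores 10.
        Console.WriteLine(sizeof(char) * "8".Length);      // 2
        Console.WriteLine(sizeof(char) * "10000".Length);  // 10
    }
}
```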

 

But is this true in practice?

 

That’s what this Curious Consultant wanted to know.

 

Do you? Let’s find out!

 

Getting Started

For these benchmarks, we’ll be timing the following common C# operations, once with int values and once with their stringified equivalents: if comparisons, switch-case statements, Dictionary adds, and Dictionary reads.

 

The C# is written in Visual Studio 2017 targeting .NET Framework version 4.7.1 (x64). The source code is available at the end of this blog so you can benchmark it on your own system.
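The timing pattern looks roughly like this. This is a hedged sketch of the approach, not the article’s actual source (which appears at the end of the post); `BenchmarkSketch`, the iteration count, and the comparison targets are illustrative choices, and the real test may pre-stringify its values rather than calling `ToString()` inside the loop.

```csharp
using System;
using System.Diagnostics;

class BenchmarkSketch
{
    static void Main()
    {
        const int iterations = 10_000_000;

        // Time the "int" if-comparison.
        var sw = Stopwatch.StartNew();
        int intHits = 0;
        for (int i = 0; i < iterations; i++)
        {
            if (i == 5) intHits++;
        }
        sw.Stop();
        Console.WriteLine($"int if:    {sw.Elapsed.TotalSeconds:00.0000000}");

        // Time the "stringified-int" if-comparison.
        sw.Restart();
        int strHits = 0;
        string target = "5";
        for (int i = 0; i < iterations; i++)
        {
            string s = i.ToString();   // hypothetical: the real test may build these up front
            if (s == target) strHits++;
        }
        sw.Stop();
        Console.WriteLine($"string if: {sw.Elapsed.TotalSeconds:00.0000000}");
    }
}
```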

 

Who won?

Let’s see what happened on my machine. The tests were performed twice, rebooting the computer before each run to clear the slate.

 

All times are shown in seconds, to seven decimal places.

 

Lower numbers indicate faster performance.

 

 

Run 1:

| Operation | 10 | 1,000 | 100,000 | 10,000,000 |
|---|---|---|---|---|
| if “int” comparison | 00.0000019 | 00.0000047 | 00.0005040 | 00.0479261 |
| if “stringified-int” comparison | 00.0000011 | 00.0000173 | 00.0018386 | 00.2373046 |
| switch-case “int” | 00.0000003 | 00.0000158 | 00.0002180 | 00.0162536 |
| switch-case “stringified-int” | 00.0000007 | 00.0000169 | 00.0016154 | 00.1505063 |
| Dictionary add “int” | 00.0000537 | 00.0000114 | 00.0017856 | 00.1707078 |
| Dictionary add “stringified-int” | 00.0000154 | 00.0000557 | 00.0039296 | 00.5202994 |
| Dictionary read “int” | 00.0000371 | 00.0000126 | 00.0014076 | 00.1164386 |
| Dictionary read “stringified-int” | 00.0000221 | 00.0000387 | 00.0045665 | 00.5322457 |

 

 

Run 2:

| Operation | 10 | 1,000 | 100,000 | 10,000,000 |
|---|---|---|---|---|
| if “int” comparison | 00.0000059 | 00.0000067 | 00.0006723 | 00.0405404 |
| if “stringified-int” comparison | 00.0000011 | 00.0000632 | 00.0074283 | 00.2818466 |
| switch-case “int” | 00.0000007 | 00.0000059 | 00.0006016 | 00.0345193 |
| switch-case “stringified-int” | 00.0000023 | 00.0000628 | 00.0060483 | 00.1571686 |
| Dictionary add “int” | 00.0007948 | 00.0000450 | 00.0059282 | 00.1772322 |
| Dictionary add “stringified-int” | 00.0000402 | 00.0000971 | 00.0124622 | 00.4916288 |
| Dictionary read “int” | 00.0000643 | 00.0000497 | 00.0044764 | 00.1473004 |
| Dictionary read “stringified-int” | 00.0000430 | 00.0001319 | 00.0143146 | 00.5096981 |

 

The Winning Results

Check out those numbers above.

 

Unless someone spots a flaw in my code, when performing around 10 comparisons or Dictionary add/read operations, the stringified values win hands down. The exception is the switch-case statements, which I found puzzling, as I thought strings would have won everywhere.

 

When there are 1,000 or more comparisons, the performance numbers switch back in favor of ints by clear margins.
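One plausible contributor to the widening gap (an explanation I’m offering, not one proven by these benchmarks) is key hashing: an int key hashes directly from its value, while a string key must hash and compare every character. A small sketch of the two Dictionary shapes used above, with an illustrative `1000`-entry fill:

```csharp
using System;
using System.Collections.Generic;

class DictionaryKeySketch
{
    static void Main()
    {
        var byInt = new Dictionary<int, bool>();
        var byString = new Dictionary<string, bool>();

        for (int i = 0; i < 1000; i++)
        {
            byInt[i] = true;                // int key: hashes by value
            byString[i.ToString()] = true;  // string key: hashes every character
        }

        // Reads go through the same hashing paths.
        Console.WriteLine(byInt[42]);       // True
        Console.WriteLine(byString["42"]);  // True
    }
}
```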

 

What does this mean?

Well, using numbers as ints seems like a no-brainer. But there are systems and situations where their string representation must be used.

 

If you need speed and have to micro-optimize every nanosecond, ints should be used in the majority of cases, as they were faster by roughly a factor of 10 in most tests.

 

The Code:

 



David Lozinski