wwb
Aug 17, 2004

quote:

It's XOR

Doh.

Dietrich posted:

Do not do that either. Concatenating strings in .net is a truck load of slow. Although this is less bad than the constant idea, it would only suck for you if you were trying to load a hash table with a large number of your objects, or compare your object with a large number of other objects or something.

Gotcha, I thought about that issue when writing the post. Luckily most cases of this don't end up in collections with more than half a dozen instances. Will try to use fancy XORs in the future.
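
Something like this is what I mean (Point and the usual 17/31 prime constants are illustrative, not code from my project):

code:
public class Point
{
    public readonly int X;
    public readonly int Y;

    public Point(int x, int y) { X = x; Y = y; }

    public override bool Equals(object obj)
    {
        var other = obj as Point;
        return other != null && other.X == X && other.Y == Y;
    }

    public override int GetHashCode()
    {
        // Multiply-then-XOR so (1, 2) and (2, 1) don't collide the way
        // a plain X ^ Y would.
        unchecked
        {
            int hash = 17;
            hash = hash * 31 ^ X.GetHashCode();
            hash = hash * 31 ^ Y.GetHashCode();
            return hash;
        }
    }
}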

ljw1004
Jan 18, 2005

rum

Dietrich posted:

I too enjoy having to make sure that the GetHashCode implementation is not poo poo every time I use a class in a collection.

If you're writing a throw away test class, sure. If you're writing anything else, just take the extra half a second and return the hash code of the logical key.

The point is that writing a correct GetHashCode is hard. It's the source of a huge number of bugs. It will usually take longer than half a second to understand all the issues and ramifications (e.g. you should base it only on fields that won't be mutated, but often you can't use "readonly" for them due to other CLR constraints, so you need to trace through your code to make sure of immutability invariants).

Far better to take that half second to write a simple and guaranteed-correct GetHashCode, and do the work for a better-performing one only if your profiling tells you to.
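
For instance (Customer and its Id are hypothetical), hashing a single never-mutated identity field is about as simple and correct as it gets:

code:
public class Customer
{
    public readonly int Id;          // never mutated, so safe to hash
    public string Name { get; set; } // mutable, so deliberately left out

    public Customer(int id) { Id = id; }

    public override bool Equals(object obj)
    {
        var other = obj as Customer;
        return other != null && other.Id == Id;
    }

    public override int GetHashCode()
    {
        // Correct by construction. Even a constant like 0 would be
        // *correct* (just slow), degrading hash lookups to linear scans.
        return Id.GetHashCode();
    }
}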

Dietrich
Sep 11, 2001

ljw1004 posted:

The point is that writing a correct GetHashCode is hard. It's the source of a huge number of bugs. It will usually take longer than half a second to understand all the issues and ramifications (e.g. you should base it only on fields that won't be mutated, but often you can't use "readonly" for them due to other CLR constraints, so you need to trace through your code to make sure of immutability invariants).

Far better to take that half second to write a simple and guaranteed-correct GetHashCode, and do the work for a better-performing one only if your profiling tells you to.

I can't disagree with you strongly enough.

No Pants
Dec 10, 2000

Dietrich posted:

I too enjoy having to make sure that the GetHashCode implementation is not poo poo every time I use a class in a collection.

If you're writing a throw away test class, sure. If you're writing anything else, just take the extra half a second and return the hash code of the logical key.

If this is an issue in your code base, implement a coding standard. A hash code is used for so few things that it isn't an issue in ours. I already do everything you've advocated here, but I'm not going to say a coworker did something objectively wrong because I want to use his class for something it wasn't designed for.

boo_radley
Dec 30, 2005

Politeness costs nothing

No Pants posted:

I already do everything you've advocated here, but I'm not going to say a coworker did something objectively wrong because I want to use his class for something it wasn't designed for.

Yeah, making sure your code does just what it needs to is probably for the best. :cheers:

ninjeff
Jan 19, 2004

Dietrich posted:

I can't disagree with you strongly enough.

  1. An incorrect GetHashCode will cause logic errors in collection methods.
  2. An unoptimised GetHashCode may cause noticeable performance issues in your program.
  3. An unoptimised GetHashCode that turns out to be a problem can be optimised in isolation without changing the program's design (unlike some optimisations).
  4. Developer time is valuable and developers make mistakes.

Therefore, use a simple but known correct implementation for GetHashCode until you have a clear performance problem.
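
To make point 1 concrete, here's a minimal repro (MutableKey is a made-up class):

code:
using System;
using System.Collections.Generic;

class MutableKey
{
    public int Value { get; set; }

    public override bool Equals(object obj)
    {
        var other = obj as MutableKey;
        return other != null && other.Value == Value;
    }

    public override int GetHashCode() { return Value; } // hashes mutable state
}

class Program
{
    static void Main()
    {
        var key = new MutableKey { Value = 1 };
        var dict = new Dictionary<MutableKey, string> { { key, "hello" } };

        key.Value = 2; // hash changes, but the entry stays in its old bucket

        Console.WriteLine(dict.ContainsKey(key)); // False - the entry is lost
    }
}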

No Pants
Dec 10, 2000

boo_radley posted:

Yeah, making sure your code does just what it needs to is probably for the best. :cheers:

Okay, let's add my hashing class to this project because I have a twenty-property class that will never be used as a key in a hash-based collection, ever. That's a level beyond premature optimization.

Edit: Sorry, I was mad about coding.

No Pants fucked around with this message at 21:34 on Dec 22, 2011

boo_radley
Dec 30, 2005

Politeness costs nothing

No Pants posted:

Okay, let's add my hashing class to this project because I have a twenty-property class that will never be used as a key in a hash-based collection, ever. That's a level beyond premature optimization.

I was agreeing with you, you big pantsless nerd! If it doesn't really matter to the scope of your project, don't bother.

ljw1004
Jan 18, 2005

rum

boo_radley posted:

It might be making your fields read-only, or extending your class to account for mutable fields (ex. hash caching), but there has to be a better option than a static hash value.

Could you give an outline of what you mean by "hash caching to account for mutable fields"? I've been mulling it over for a while but I couldn't think of any useful hash caching scheme which satisfies the basic requirement:

If two instances might ever in the future return True when you call .Equals() on them, then they must have the same hash value right now.



(of course, if the invariants in your code are such that no one will ever mutate the fields, then that satisfies the basic requirement... but it's not the mutable fields that you mentioned).

ljw1004 fucked around with this message at 22:37 on Dec 22, 2011

Dietrich
Sep 11, 2001

No Pants posted:

Okay, let's add my hashing class to this project because I have a twenty-property class that will never be used as a key in a hash-based collection, ever. That's a level beyond premature optimization.

Edit: Sorry, I was mad about coding.

In most cases you don't want to compare every property on a class to decide whether two instances refer to the same entity. Hope this helps.

boo_radley
Dec 30, 2005

Politeness costs nothing

ljw1004 posted:

Could you give an outline of what you mean by "hash caching to account for mutable fields"? I've been mulling it over for a while but I couldn't think of any useful hash caching scheme which satisfies the basic requirement:

This is an interesting and fair question. I started off with the idea that a dictionary contains some amount of metadata about its constituents. As a brief trip into Reflector shows, this was a terrible assumption. At the heart of it is:

code:
    private Entry<TKey, TValue>[] entries;
(...)
    [StructLayout(LayoutKind.Sequential)]
    private struct Entry
    {
        public int hashCode;
        public int next;
        public TKey key;
        public TValue value;
    }
So right up front, I need to say that the dictionary class is a lot cleaner than I expected -- in fact, between the next field and looking at HashHelper, I think it's just doing linear or quadratic probing for collision resolution straight out of TAOCP. I'd based my idea on the assumption that internally, the structure would be a list of some sort rather than an array, and that's clearly not the case. As it stands, my idea is a non-starter.

So what follows is my own mulling on the idea of working with mutable values in a collection. This is pretty academic and it probably doesn't go anywhere interesting for you, but the exercise was neat.

I started with some ideas about how a list-backed dictionary would do the caching, but as the design progressed, I wound up with the dictionary being the mediator for its items, ex:
code:
dictionary.SetValue(key, value)
and that's a terrible implementation, I think. It runs counter to the progression of C# as a fluid and intuitive language.

The existing implementation could be altered (I won't say improved here) for this case by the following:
1. Implement internal storage as a linked list -- this has the benefit of being able to move items to different hash buckets as values change.
2. Have INotifyPropertyChanged or some other observer pattern watching the key/value pairs and modifying the internal buckets as k/vs change -- moving or deleting nodes as needed (rough sketch below).
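
A very rough sketch of option 2 (everything here is hypothetical; it ignores removal and unsubscription, and it lazily rebuilds the whole table rather than moving individual nodes):

code:
using System.Collections.Generic;
using System.ComponentModel;

class RehashingDictionary<TKey, TValue> where TKey : INotifyPropertyChanged
{
    private Dictionary<TKey, TValue> inner = new Dictionary<TKey, TValue>();

    public void Add(TKey key, TValue value)
    {
        inner.Add(key, value);
        key.PropertyChanged += delegate { Rehash(); };
    }

    public bool TryGetValue(TKey key, out TValue value)
    {
        return inner.TryGetValue(key, out value);
    }

    private void Rehash()
    {
        // A mutated key is filed under its stale hash, so hash-based Remove
        // can't find it. Enumeration doesn't use hashing, though, so we can
        // rebuild the table and every key gets re-bucketed.
        var rebuilt = new Dictionary<TKey, TValue>();
        foreach (var pair in inner)
            rebuilt[pair.Key] = pair.Value;
        inner = rebuilt;
    }
}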

Strict 9
Jun 20, 2001

by Y Kant Ozma Post
I'm having a hell of a time getting TeamCity to work with deployment, and would kill for some help. I think I've spent about 15 hours on this so far.

Right now I'm at the point where I have two build steps, one for the solution, one for MSBuild. Both succeed. In the working directory, everything looks fine. But in the directory where the website is deployed, about half of the files are missing. Specifically, all the .aspx.cs and .cs files are missing, but the .aspx files all seem to be there.

Any ideas?

Plorkyeran
Mar 22, 2007

To Escape The Shackles Of The Old Forums, We Must Reject The Tribal Negativity He Endorsed
The .cs files aren't supposed to be deployed.

biznatchio
Mar 31, 2001


Buglord

No Pants posted:

Okay, let's add my hashing class to this project because I have a twenty-property class that will never be used as a key in a hash-based collection, ever. That's a level beyond premature optimization.

code:
public override int GetHashCode()
{
    throw new NotImplementedException();
}
At least that way you know something's wrong if anyone ever tries to use it as a hash key, rather than simply hoping nobody ever does and signing them up for a very surprising debugging experience if they do.

The Gripper
Sep 14, 2004
i am winner

biznatchio posted:

code:
public override int GetHashCode()
{
    throw new NotImplementedException();
}
At least that way you know something's wrong if anyone ever tries to use it as a hash key, rather than simply hoping nobody ever does and signing them up for a very surprising debugging experience if they do.
This is pretty much the only way to do it if it *has* to be implemented but you aren't going to use it. We had this problem once where a previous developer had half-implemented a few methods, then decided that since he wasn't using them it didn't matter what they did. We eventually had to use them, figured "they're implemented, we'll assume they're working", and had some confusing debugging experiences trying to track down the issue. Having them throw a NotImplementedException would have sorted us right out from the start.

wellwhoopdedooo
Nov 23, 2007

Pound Trooper!

biznatchio posted:

code:
public override int GetHashCode()
{
    throw new NotImplementedException();
}
At least that way you know something's wrong if anyone ever tries to use it as a hash key, rather than simply hoping nobody ever does and signing them up for a very surprising debugging experience if they do.

I absolutely agree with this, provided there's good error handling elsewhere in your app. That is, if your application should be able to continue running (possibly in a degraded state, or unable to execute a few functions), it should report this error and move on. I personally love the poo poo out of exceptions, but poorly done they're worse than not at all.

Strict 9
Jun 20, 2001

by Y Kant Ozma Post

Plorkyeran posted:

The .cs files aren't supposed to be deployed.

Oh. Well that explains part of it then. I'm still getting a missing reference though, but I'm guessing that's something different. Thanks!

wwb
Aug 17, 2004

Are you deploying /bin/*? Missing reference to what?

Typically this means you have referenced some library in your app and said library isn't getting deployed, often because the assembly is GACed on your machine. Checking Copy Local can help.
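
Under the hood, Copy Local is the Private flag on the reference in your csproj (the assembly name here is just an example):

code:
<Reference Include="Some.Gacd.Library">
  <!-- "Copy Local" in the IDE; forces the dll into bin\ on build -->
  <Private>True</Private>
</Reference>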

Zhentar
Sep 28, 2003

Brilliant Master Genius
Is there a good way to detect memory pressure conditions? Part of my application can end up creating a very large number of objects that can be re-generated with pretty low overhead. I'd like to come to some middle ground between crashing when I could easily free a gigabyte of memory, and aggressively dumping things needlessly to avoid crashing in rare cases.

Sedro
Dec 31, 2008
That's a tough one. Unlike Java, the CLR doesn't have any concept of a "soft reference," so either the GC collects everything ASAP or you run out of memory and crash. A simple answer may be to pick a magic number of objects to keep around as strong references. You might look into the Caching Application Block (although it's deprecated) or HttpRuntime.Cache to figure out what techniques they use.

raminasi
Jan 25, 2005

a last drink with no ice
I'm trying to marshal some of these from C#:
code:
struct foo {
    some_enum type;
    union {
        bar as_bar;
        baz as_baz;
    } unioned_rep;
};
Is this the correct representation?
code:
[StructLayoutAttribute(LayoutKind.Explicit)]
struct Foo
{
    [FieldOffset(0)]
    SomeEnum Type;
    [FieldOffset(sizeof(SomeEnum))]
    Bar AsBar;
    [FieldOffset(sizeof(SomeEnum))]
    Baz AsBaz;
}
Assuming all the field definitions are correct anyway - I'm concerned with getting the union right. Something's not marshalling correctly and this seems like the most likely culprit.

raminasi fucked around with this message at 06:46 on Dec 28, 2011

Sedro
Dec 31, 2008
Looks good to me. Check the packing and make sure that Marshal.SizeOf(typeof(Foo)) is correct.

Dessert Rose
May 17, 2004

awoken in control of a lucid deep dream...

Sedro posted:

That's a tough one. Unlike Java, CLR doesn't have any concept of a "soft reference,"

Sure does: System.WeakReference
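
A regenerate-on-demand wrapper is only a few lines (WeakCache is my own name for it, not a framework type):

code:
using System;

class WeakCache<T> where T : class
{
    private readonly Func<T> factory;
    private readonly WeakReference weakRef = new WeakReference(null);

    public WeakCache(Func<T> factory) { this.factory = factory; }

    public T Get()
    {
        var value = weakRef.Target as T; // null once the GC has collected it
        if (value == null)
        {
            value = factory();           // regenerate on demand
            weakRef.Target = value;
        }
        return value;
    }
}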

Destroyenator
Dec 27, 2004

Don't ask me lady, I live in beer

Dessert Rose posted:

Sure does: System.WeakReference
Be aware the GC may (will?) function differently in debug and release builds.

Sedro
Dec 31, 2008
Soft reference vs. weak reference

A soft reference is somewhere in between a strong and weak reference. Under low memory pressure, they function as strong references. Under high memory pressure, they function as weak references.

Zhentar
Sep 28, 2003

Brilliant Master Genius
I hadn't really considered weak references... I've only used them previously for event handlers, which wasn't much fun, but I'd forgotten that was because of the event handler part, not the weak part. Still, I don't think they're the right choice. From the documentation, and some discussion, it doesn't sound like a WeakReference will cause an object to survive a generation 0 garbage collection, in which case my cache would be getting flushed every few milliseconds. I may still give it a try though; my initialization is cheap enough that it still could be good enough.

Currently, I'm thinking the best approach is to go with a cache collection that will use System.GC.WaitForFullGCApproach and then flush itself. Oh, except that won't work if concurrent garbage collection is enabled. Crap.

Sedro
Dec 31, 2008
I did some poking around for you. System.Web.Caching.Cache internally uses GlobalMemoryStatusEx and some magic numbers to determine the memory pressure. It might also be using timers. Hope you like getting your hands dirty.

Edit: You can also use MemoryFailPoint to better predict and handle an out-of-memory situation.
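
A sketch of the shape (the 256 MB estimate and both method names are placeholders):

code:
using System;
using System.Runtime;

static class Example
{
    static void LoadWithGate()
    {
        try
        {
            // Throws up front if ~256 MB probably can't be satisfied,
            // instead of OOMing halfway through the work.
            using (new MemoryFailPoint(256))
            {
                RunMemoryHungryOperation();
            }
        }
        catch (InsufficientMemoryException)
        {
            FlushRegenerableCaches(); // free what can be rebuilt, then retry or degrade
        }
    }

    static void RunMemoryHungryOperation() { }
    static void FlushRegenerableCaches() { }
}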

Sedro fucked around with this message at 20:39 on Dec 28, 2011

epswing
Nov 4, 2003

Soiled Meat
Ok, I'm not sure how this binding scenario is supposed to work.

Say my MainWindow has the following viewmodel

code:
public class MainWindowViewModel : INotifyPropertyChanged
{
    public ObservableCollection<Item> MyItems { get; set; }

    public MainWindowViewModel()
    {
        MyItems = new ObservableCollection<Item>();
    }
    
    ....
}
and in the xaml, I declare my UserControl

code:
<Window ...>
    <local:MyControl Items="{Binding MyItems}" />
</Window>
and in the UserControl I have an Items DP

code:

public ObservableCollection<Item> Items
{
    get { return (ObservableCollection<Item>)GetValue(ItemsProperty); }
    set { SetValue(ItemsProperty, value); }
}

public static readonly DependencyProperty ItemsProperty =
    DependencyProperty.Register("Items",
    typeof(ObservableCollection<Item>), typeof(MyControl), new UIPropertyMetadata(null));
and in my UserControl's view model I have an ObservableCollection<Thing> which needs to be cleared and repopulated whenever Items changes. But I can't just Items.CollectionChanged += (s, e) => { run repopulation code }; in the UserControl's constructor because Items is null before a binding is made.

This is mostly just a braindump; it might not make sense. I can put together a (non)working example if necessary.

epswing fucked around with this message at 20:39 on Dec 28, 2011

Sedro
Dec 31, 2008
You can use the UIPropertyMetadata constructor overload that takes a PropertyChangedCallback to get notified when Items changes.
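
Something like this inside MyControl, reusing your Items DP (OnItemsCollectionChanged and Repopulate are stand-ins for your repopulation code):

code:
public static readonly DependencyProperty ItemsProperty =
    DependencyProperty.Register("Items",
        typeof(ObservableCollection<Item>), typeof(MyControl),
        new UIPropertyMetadata(null, OnItemsChanged));

private static void OnItemsChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
    var control = (MyControl)d;
    var oldItems = e.OldValue as ObservableCollection<Item>;
    var newItems = e.NewValue as ObservableCollection<Item>;

    // Re-hook CollectionChanged whenever the binding swaps the collection in or out.
    if (oldItems != null)
        oldItems.CollectionChanged -= control.OnItemsCollectionChanged;
    if (newItems != null)
        newItems.CollectionChanged += control.OnItemsCollectionChanged;

    control.Repopulate(); // clear and rebuild your ObservableCollection<Thing>
}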

Zhentar
Sep 28, 2003

Brilliant Master Genius

Sedro posted:

I did some poking around for you. System.Web.Caching.Cache internally uses GlobalMemoryStatusEx and some magic numbers to determine the memory pressure. It might also be using timers. Hope you like getting your hands dirty.

Yeah, screw that, it's not worth that much trouble.

Hmm, I could go with a hybrid of the weak reference and magic number approach - use a collection with a small, safe maximum capacity using strong references, evicting things to a collection with weak references. As long as they survive long enough to get promoted to at least generation 1, the weak objects should be reasonably long lived, and if they don't, I'm still not any worse off than just using the magic number.
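
Something like this is what I have in mind (names and capacity are made up, and dead WeakReference entries would still need sweeping eventually):

code:
using System;
using System.Collections.Generic;

class HybridCache<TKey, TValue> where TValue : class
{
    private readonly int capacity;
    private readonly Queue<TValue> strong = new Queue<TValue>();
    private readonly Dictionary<TKey, WeakReference> weak =
        new Dictionary<TKey, WeakReference>();

    public HybridCache(int capacity) { this.capacity = capacity; }

    public void Add(TKey key, TValue value)
    {
        weak[key] = new WeakReference(value);
        strong.Enqueue(value);    // the newest N items are pinned by strong refs
        if (strong.Count > capacity)
            strong.Dequeue();     // older items survive only via their WeakReference
    }

    public bool TryGetValue(TKey key, out TValue value)
    {
        WeakReference wr;
        value = weak.TryGetValue(key, out wr) ? wr.Target as TValue : null;
        return value != null;     // false means: regenerate the object
    }
}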


Sedro posted:

Edit: You can also use MemoryFailPoint to better predict and handle an out-of-memory situation.

Still not ideal, but it comes closer than anything else I've seen. I may well make use of that.

aBagorn
Aug 26, 2004
Would it be beneficial for me to attempt to learn C# and ASP at the same time? With some Sharepoint integration thrown in?

boo_radley
Dec 30, 2005

Politeness costs nothing

aBagorn posted:

Would it be beneficial for me to attempt to learn C# and ASP at the same time? With some Sharepoint integration thrown in?

I would recommend starting with C#, then transitioning to ASP.NET (fairly quickly if you're comfortable with other languages). Get to know ASP.NET to a good degree before tackling SharePoint. While SharePoint is based on ASP.NET, it's much larger and more complex. It also imposes a certain rigor on development that web developers seem to chafe against. Well, it doesn't exactly impose that rigor, but there are a lot of ways that you can really damage a SharePoint farm if you don't know what you're doing and stray from best practices.

If you want some insight into SharePoint development, start here for the developer's overview and in particular read Microsoft SharePoint Foundation as an ASP.NET Application and the sections in ASP.NET vs. SharePoint: How Development Differs.

aBagorn
Aug 26, 2004

boo_radley posted:

I would recommend starting with C#, then transitioning to ASP.NET (fairly quickly if you're comfortable with other languages). Get to know ASP.NET to a good degree before tackling SharePoint. While SharePoint is based on ASP.NET, it's much larger and more complex. It also imposes a certain rigor on development that web developers seem to chafe against. Well, it doesn't exactly impose that rigor, but there are a lot of ways that you can really damage a SharePoint farm if you don't know what you're doing and stray from best practices.

If you want some insight into SharePoint development, start here for the developer's overview and in particular read Microsoft SharePoint Foundation as an ASP.NET Application and the sections in ASP.NET vs. SharePoint: How Development Differs.

Thanks. This will be my first foray into developing. I've put a little bit of time into C# already (a lot of basics at this point, still learning), and one of my bosses said they would look into budgeting in a new Jr Developer position for me if I felt comfortable enough with the .NET web side.

Need to get out of help desk hell, you see.

biznatchio
Mar 31, 2001


Buglord

Zhentar posted:

Still not ideal, but it comes closer than anything else I've seen. I may well make use of that.

You could always use finalizers to continually resurrect cache items when memory pressure is low.

Quebec Bagnet
Apr 28, 2009

mess with the honk
you get the bonk
Lipstick Apathy
Is GhostDoc Pro really worth it? I've been using the free edition and it's been working fine; is it worth shelling out?

Factor Mystic
Mar 20, 2006

Baby's First Post-Apocalyptic Fiction

Factor Mystic posted:

Is it possible to have a gradient run the length of an arc in WPF? I have seen this post which seems to indicate a goofy amount of code is required (so, par for WPF) but the example is a geometry path, not an arc, so I'm hoping to find some trick that will make it simple.

Anyone have a thought on this, from last page?

raminasi
Jan 25, 2005

a last drink with no ice

GrumpyDoctor posted:

I'm trying to marshal some of these from C#:
code:
struct foo {
    some_enum type;
    union {
        bar as_bar;
        baz as_baz;
    } unioned_rep;
};
Is this the correct representation?
code:
[StructLayoutAttribute(LayoutKind.Explicit)]
struct Foo
{
    [FieldOffset(0)]
    SomeEnum Type;
    [FieldOffset(sizeof(SomeEnum))]
    Bar AsBar;
    [FieldOffset(sizeof(SomeEnum))]
    Baz AsBaz;
}
Assuming all the field definitions are correct anyway - I'm concerned with getting the union right. Something's not marshalling correctly and this seems like the most likely culprit.

Quoting myself because I think I found the problem: in the native struct, unioned_rep has an offset of eight bytes, not four. Is the best practice here to force the alignment on the native struct or the managed one?

Sedro
Dec 31, 2008
Are you compiling for x64? It's probably putting the union at a word boundary. It should have a 4-byte offset on x86.

If so, the fix is to use a named union instead of trying to use LayoutKind.Explicit to mimic an anonymous union:
code:
[StructLayoutAttribute(LayoutKind.Explicit)]
struct FooUnion
{
    [FieldOffset(0)]
    Bar AsBar;
    [FieldOffset(0)]
    Baz AsBaz;
}

[StructLayoutAttribute(LayoutKind.Sequential)]
struct Foo
{
    SomeEnum Type;
    FooUnion UnionMember;
}
In any case, make sure you test this on both platforms.

raminasi
Jan 25, 2005

a last drink with no ice
It's x86, and the problem is that it's a 4-byte offset, but the native structure has an 8-byte offset. I'm going to need to "manually" fiddle with the alignment on one of them, but I'm not sure which one is a better plan.

e: here's MSDN on the structure alignment compiler switch:

MSDN posted:

When you specify this option, each structure member after the first is stored on either the size of the member type or n-byte boundaries (where n is 1, 2, 4, 8, or 16), whichever is smaller.
8 is the default, and the union is 48 bytes.

raminasi fucked around with this message at 01:03 on Jan 1, 2012

ljw1004
Jan 18, 2005

rum

GrumpyDoctor posted:

It's x86, and the problem is that it's a 4-byte offset, but the native structure has an 8-byte offset. I'm going to need to "manually" fiddle with the alignment on one of them, but I'm not sure which one is a better plan.

When I'm doing anything with persistent or shared binary data structures, I'm always completely explicit about every single drat piece of code that reads or writes that data structure. I never use "sizeof" or the like. I always give it exact byte offsets (and specify whether it's big-endian or little-endian!). It feels like if it's a binary format that will persist or be transferred between modules, then it sure as heck needs to be carefully documented, and the code that reads it has to be paranoid to avoid danger from malicious or malformed input.
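
For the struct upthread, being explicit would look something like this (the 8-byte offset and 48-byte union size come from GrumpyDoctor's measurements; Size is just their sum):

code:
using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Explicit, Size = 56)]
struct Foo
{
    [FieldOffset(0)]
    public SomeEnum Type;  // 4 bytes of enum + 4 bytes of padding in the native layout

    [FieldOffset(8)]       // hard-coded from the documented native offset
    public Bar AsBar;

    [FieldOffset(8)]
    public Baz AsBaz;
}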
