|
quote:It's XOR. Doh. Dietrich posted:Do not do that either. Concatenating strings in .net is a truck load of slow. Although this is less bad than the constant idea, it would only suck for you if you were trying to load a hash table with a large number of your objects, or compare your object with a large number of other objects or something. Gotcha, thought about that issue when writing the post. Luckily most cases of this don't end up in collections with more than a half dozen instances. Will try and use fancy XORs in future.
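For reference, a minimal sketch of the XOR approach being described. The type and fields are hypothetical; the point is combining field hashes directly instead of concatenating strings (which allocates on every call):

```csharp
using System;

// Hypothetical immutable value type; the "fancy XOR" pattern combines
// each field's hash code arithmetically -- no string allocation.
public sealed class Point
{
    private readonly int x;
    private readonly int y;

    public Point(int x, int y) { this.x = x; this.y = y; }

    public override bool Equals(object obj)
    {
        var other = obj as Point;
        return other != null && other.x == x && other.y == y;
    }

    public override int GetHashCode()
    {
        // A plain x ^ y makes (1,2) and (2,1) collide, so mix in a
        // prime multiplier before XORing.
        unchecked
        {
            return (x.GetHashCode() * 31) ^ y.GetHashCode();
        }
    }
}
```

Unlike the string-concatenation version, this allocates nothing per call, so it stays cheap even when the object does end up in a big hash table.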
|
# ? Dec 21, 2011 18:59 |
|
|
Dietrich posted:I too enjoy having to make sure that the GetHashCode implementation is not poo poo every time I use a class in a collection. The point is that writing a correct GetHashCode is hard. It's the source of a huge number of bugs. It will usually take longer than half a second to understand all the issues and ramifications (e.g. you should base it only on fields that won't be mutated, but often you can't use "readonly" for them due to other CLR constraints, so you need to trace through your code to make sure of immutability invariants). Far better to take that half second to write a simple and guaranteed-correct GetHashCode, and do the work for a better-performing one only if your profiling tells you to.
|
# ? Dec 21, 2011 19:17 |
|
ljw1004 posted:The point is that writing a correct GetHashCode is hard. It's the source of a huge number of bugs. It will usually take longer than half a second to understand all the issues and ramifications (e.g. you should base it only on fields that won't be mutated, but often you can't use "readonly" for them due to other CLR constraints, so you need to trace through your code to make sure of immutability invariants). I can't disagree with you strongly enough.
|
# ? Dec 21, 2011 20:56 |
|
Dietrich posted:I too enjoy having to make sure that the GetHashCode implementation is not poo poo every time I use a class in a collection.
|
# ? Dec 22, 2011 01:24 |
|
No Pants posted:I already do everything you've advocated here, but I'm not going to say a coworker did something objectively wrong because I want to use his class for something it wasn't designed for. Yeah, making sure your code does just what it needs to is probably for the best.
|
# ? Dec 22, 2011 03:51 |
|
Dietrich posted:I can't disagree with you strongly enough.
Therefore, use a simple but known correct implementation for GetHashCode until you have a clear performance problem.
|
# ? Dec 22, 2011 10:06 |
|
boo_radley posted:Yeah, making sure your code does just what it needs to is probably for the best. Edit: Sorry, I was mad about coding. No Pants fucked around with this message at 21:34 on Dec 22, 2011 |
# ? Dec 22, 2011 20:33 |
|
No Pants posted:Okay, let's add my hashing class to this project because I have a twenty-property class that will never be used as a key in a hash-based collection, ever. That's a level beyond premature optimization. I was agreeing with you, you big pantsless nerd! If it doesn't really matter to the scope of your project, don't bother.
|
# ? Dec 22, 2011 21:10 |
|
boo_radley posted:It might be making your fields read-only, or extending your class to account for mutable fields (ex. hash caching), but there has to be a better option than a static hash value. Could you give an outline of what you mean by "hash caching to account for mutable fields"? I've been mulling it over for a while but I couldn't think of any useful hash caching scheme which satisfies the basic requirement: If two instances might ever in the future return True when you call .Equals() on them, then they must have the same hash value right now. (of course, if the invariants in your code are such that no one will ever mutate the fields, then that satisfies the basic requirement... but it's not the mutable fields that you mentioned). ljw1004 fucked around with this message at 22:37 on Dec 22, 2011 |
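A quick sketch of why that basic requirement matters: a hypothetical key type whose hash tracks a mutable field loses itself inside a HashSet the moment the field changes.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical type whose hash depends on a mutable field -- exactly the
// trap being discussed: mutate it after insertion and lookups go dark.
public sealed class Name
{
    public string Value; // mutable on purpose, to show the failure

    public override bool Equals(object obj)
    {
        var other = obj as Name;
        return other != null && other.Value == Value;
    }

    public override int GetHashCode() { return Value.GetHashCode(); }
}

public static class Demo
{
    public static void Main()
    {
        var n = new Name { Value = "alice" };
        var set = new HashSet<Name> { n };

        n.Value = "bob"; // hash changes, but n stays in the old bucket

        // The set still physically contains n, but can no longer find it
        // under either the old identity or the new one:
        Console.WriteLine(set.Contains(n));                          // False
        Console.WriteLine(set.Contains(new Name { Value = "bob" })); // False
        Console.WriteLine(set.Count);                                // 1
    }
}
```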
# ? Dec 22, 2011 22:34 |
|
No Pants posted:Okay, let's add my hashing class to this project because I have a twenty-property class that will never be used as a key in a hash-based collection, ever. That's a level beyond premature optimization. You don't want to equate every property on a class to see if they are supposed to be references to the same entity in most cases. Hope this helps.
|
# ? Dec 23, 2011 01:01 |
|
ljw1004 posted:Could you give an outline of what you mean by "hash caching to account for mutable fields"? I've been mulling it over for a while but I couldn't think of any useful hash caching scheme which satisfies the basic requirement: This is an interesting and fair question. I started off with the idea that a dictionary contains some amount of metadata about its constituents. As a brief trip in reflector shows, this was a terrible assumption. At the heart of it is: code:
So what follows is my own mulling on the idea of working with mutable values in a collection. This is pretty academic and it probably doesn't go anywhere interesting for you, but the exercise was neat. I started with some ideas about how a list-backed dictionary would do the caching, but as the design progressed, I wound up with the dictionary being the mediator for its items, ex: code:
The existing implementation could be altered (I won't say improved here) for this case by the following: 1. Implement internal storage as a linked list -- this has the benefit of being able to move items to different hash buckets as values change. 2. Have an INotifyPropertyChanged or some other observer pattern watching the key/ value pairs and modifying the internal buckets as k/vs change -- moving or deleting nodes as needed.
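A minimal, hypothetical sketch of step 2, a dictionary that observes INotifyPropertyChanged keys and re-buckets entries when a key mutates. All names are invented, it is O(n) per change, and it is not remotely thread-safe; it is just the "dictionary as mediator" idea made concrete.

```csharp
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Linq;

// Hypothetical observable key: raises PropertyChanged when Id mutates,
// and its hash tracks Id (the dangerous part this design compensates for).
public sealed class ObservableKey : INotifyPropertyChanged
{
    private string id;
    public event PropertyChangedEventHandler PropertyChanged;

    public string Id
    {
        get { return id; }
        set
        {
            id = value;
            var h = PropertyChanged;
            if (h != null) h(this, new PropertyChangedEventArgs("Id"));
        }
    }

    public override int GetHashCode() { return id == null ? 0 : id.GetHashCode(); }
    public override bool Equals(object o)
    {
        var k = o as ObservableKey;
        return k != null && k.id == id;
    }
}

public sealed class SelfHealingDictionary<TValue>
{
    private Dictionary<ObservableKey, TValue> inner =
        new Dictionary<ObservableKey, TValue>();

    public void Add(ObservableKey key, TValue value)
    {
        inner.Add(key, value);
        key.PropertyChanged += OnKeyChanged; // observe mutations
    }

    public TValue this[ObservableKey key] { get { return inner[key]; } }

    private void OnKeyChanged(object sender, PropertyChangedEventArgs e)
    {
        // The changed key is now in the wrong bucket. Enumeration doesn't
        // use hashes, so rebuilding the table re-lands every key in the
        // bucket matching its *current* hash. O(n) per change -- the tax
        // on mutable keys. (Throws if a mutation makes two keys equal.)
        inner = inner.ToDictionary(kv => kv.Key, kv => kv.Value);
    }
}
```

Without the rebuild in OnKeyChanged, a lookup after mutating a key would usually throw KeyNotFoundException even though the entry is still in the dictionary.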
|
# ? Dec 23, 2011 04:13 |
|
I'm having a hell of a time getting TeamCity to work with deployment, and would kill for some help. I think I've spent about 15 hours on this so far. Right now I'm at the point where I have two build steps, one for the solution and one for MSBuild. Both succeed. In the working directory, everything looks fine. But in the directory where the website is deployed, it's missing about half of the files. Specifically all the .aspx.cs and .cs files are missing, but the .aspx files all seem to be there. Any ideas?
|
# ? Dec 23, 2011 21:19 |
|
The .cs files aren't supposed to be deployed.
|
# ? Dec 23, 2011 22:20 |
|
No Pants posted:Okay, let's add my hashing class to this project because I have a twenty-property class that will never be used as a key in a hash-based collection, ever. That's a level beyond premature optimization. code:
|
# ? Dec 24, 2011 03:44 |
|
biznatchio posted:
|
# ? Dec 24, 2011 04:01 |
|
biznatchio posted:
I absolutely agree with this, provided there's good error handling elsewhere in your app. That is, if your application should be able to continue running (possibly in a degraded state, or unable to execute a few functions), it should report this error and move on. I personally love the poo poo out of exceptions, but poorly done they're worse than not at all.
|
# ? Dec 24, 2011 22:26 |
|
Plorkyeran posted:The .cs files aren't supposed to be deployed. Oh. Well that explains part of it then. I'm still getting a missing reference though, but I'm guessing that's something different. Thanks!
|
# ? Dec 26, 2011 02:21 |
|
Are you deploying /bin/*? Missing reference to what? Typically this means you have referenced some library in your app and said library isn't getting deployed. Oftentimes because the assembly is gac'd on your machine. Checking copy local can help.
|
# ? Dec 26, 2011 18:44 |
|
Is there a good way to detect memory pressure conditions? Part of my application can end up creating a very large number of objects that can be re-generated with pretty low overhead. I'd like to come to some middle ground between crashing when I could easily free a gigabyte of memory, and always having to aggressive dump things needlessly to avoid crashing in rare cases.
|
# ? Dec 28, 2011 01:18 |
|
That's a tough one. Unlike Java, the CLR doesn't have any concept of a "soft reference," so either the GC collects everything ASAP or you run out of memory and crash. A simple answer may be to pick a magic number of objects to keep around as strong references. You might look into the Caching Application Block (although deprecated) or HttpRuntime.Cache to figure out what techniques they use.
|
# ? Dec 28, 2011 01:46 |
|
I'm trying to marshal some of these from C#:code:
code:
raminasi fucked around with this message at 06:46 on Dec 28, 2011 |
# ? Dec 28, 2011 03:30 |
|
Looks good to me. Check the packing and make sure that Marshal.SizeOf(typeof(Foo)) is correct.
|
# ? Dec 28, 2011 06:02 |
|
Sedro posted:That's a tough one. Unlike Java, CLR doesn't have any concept of a "soft reference," Sure does: System.WeakReference
|
# ? Dec 28, 2011 09:46 |
|
Dessert Rose posted:Sure does: System.WeakReference
|
# ? Dec 28, 2011 10:32 |
|
Soft reference vs. weak reference A soft reference is somewhere in between a strong and weak reference. Under low memory pressure, they function as strong references. Under high memory pressure, they function as weak references.
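A small sketch of the weak half of that distinction. Method names are mine, and exact collection timing is runtime-dependent, but the key behavior is that a WeakReference offers no "keep it while memory is plentiful" middle ground the way Java's SoftReference does.

```csharp
using System;
using System.Runtime.CompilerServices;

public static class WeakDemo
{
    // Allocate in a separate, non-inlined method so no live local
    // keeps the object rooted when we force a collection.
    [MethodImpl(MethodImplOptions.NoInlining)]
    private static WeakReference MakeWeak()
    {
        return new WeakReference(new byte[1024]);
    }

    public static void Main()
    {
        WeakReference w = MakeWeak();

        GC.Collect();
        GC.WaitForPendingFinalizers();
        GC.Collect();

        // With no strong reference left, the target is typically gone
        // after the very next collection, regardless of memory pressure.
        Console.WriteLine(w.IsAlive); // usually False
    }
}
```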
|
# ? Dec 28, 2011 19:08 |
|
I hadn't really considered weak references... I've only used them previously for event handlers, which wasn't much fun, but I'd forgotten that was because of the event handler part, not the weak part. Still, I don't think they're the right choice. From the documentation, and some discussion, it doesn't sound like a WeakReference will cause an object to survive a generation 0 garbage collection, in which case my cache would be getting flushed every few milliseconds. I may still give it a try though; my initialization is cheap enough that it still could be good enough. Currently, I'm thinking the best approach is to go with a cache collection that will use System.GC.WaitForFullGCApproach and then flush itself. Oh, except that won't work if concurrent garbage collection is available. Crap.
|
# ? Dec 28, 2011 19:16 |
|
I did some poking around for you. System.Web.Caching.Cache internally uses GlobalMemoryStatusEx and some magic numbers to determine the memory pressure. It might also be using timers. Hope you like getting your hands dirty. Edit: You can also use MemoryFailPoint to better predict and handle an out-of-memory situation. Sedro fucked around with this message at 20:39 on Dec 28, 2011 |
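A hedged sketch of how MemoryFailPoint might be used here: probe for headroom before starting a memory-hungry operation instead of dying partway through it. The 64 MB figure is an arbitrary placeholder.

```csharp
using System;
using System.Runtime;

public static class FailPointDemo
{
    public static void Main()
    {
        try
        {
            // Ask the CLR whether ~64 MB is likely to be available.
            // This only probes; it doesn't reserve or allocate memory.
            using (new MemoryFailPoint(64))
            {
                // ... run the allocation-heavy work here ...
                Console.WriteLine("enough headroom, proceeding");
            }
        }
        catch (InsufficientMemoryException)
        {
            // Much cheaper to handle than OutOfMemoryException: nothing
            // has been half-built yet, so flush caches and retry or degrade.
            Console.WriteLine("low memory, flushing cache instead");
        }
    }
}
```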
# ? Dec 28, 2011 19:27 |
|
Ok, I'm not sure how this binding scenario is supposed to work. Say my MainWindow has the following viewmodel code:
code:
code:
This is mostly just a braindump, it might not make sense, I can put together a (non)working example if necessary. epswing fucked around with this message at 20:39 on Dec 28, 2011 |
# ? Dec 28, 2011 20:37 |
|
You can use a constructor on UIPropertyMetadata to create an event when Items changes.
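Something like this, sketched with hypothetical names since the actual viewmodel code wasn't shown: pass a PropertyChangedCallback to the UIPropertyMetadata constructor when registering the dependency property, and it fires whenever Items is assigned a new collection.

```csharp
using System.Collections.ObjectModel;
using System.Windows;

// Hypothetical control; "Items" and "MyControl" stand in for the
// poster's actual viewmodel members.
public class MyControl : FrameworkElement
{
    public static readonly DependencyProperty ItemsProperty =
        DependencyProperty.Register(
            "Items",
            typeof(ObservableCollection<string>),
            typeof(MyControl),
            new UIPropertyMetadata(null, OnItemsChanged));

    public ObservableCollection<string> Items
    {
        get { return (ObservableCollection<string>)GetValue(ItemsProperty); }
        set { SetValue(ItemsProperty, value); }
    }

    private static void OnItemsChanged(DependencyObject d,
        DependencyPropertyChangedEventArgs e)
    {
        // Fires when the Items *reference* changes (a new collection is
        // bound), not when items are added or removed -- hook
        // CollectionChanged on e.NewValue for that.
        var newItems = e.NewValue as ObservableCollection<string>;
        if (newItems != null)
        {
            newItems.CollectionChanged += (s, args) => { /* react to adds/removes */ };
        }
    }
}
```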
|
# ? Dec 28, 2011 20:46 |
|
Sedro posted:I did some poking around for you. System.Web.Caching.Cache internally uses GlobalMemoryStatusEx and some magic numbers to determine the memory pressure. It might also be using timers. Hope you like getting your hands dirty. Yeah, screw that, it's not worth that much trouble. Hmm, I could go with a hybrid of the weak reference and magic number approach - use a collection with a small, safe maximum capacity using strong references, evicting things to a collection with weak references. As long as they survive long enough to get promoted to at least generation 1, the weak objects should be reasonably long lived, and if they don't, I'm still not any worse off than just using the magic number. Sedro posted:Edit: You can also use MemoryFailPoint to better predict and handle an out-of-memory situation. Still not ideal, but it comes closer than anything else I've seen. I may well make use of that.
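A rough sketch of that hybrid, with placeholder names and a trivially simple eviction policy (not a tested design): a small bounded section of strong references, backed by a weak table so evicted items remain recoverable for as long as the GC leaves them alone.

```csharp
using System;
using System.Collections.Generic;

// Hybrid cache sketch: the newest `capacity` items are held strongly;
// everything ever cached is also reachable through a WeakReference,
// so a demoted item can still be handed back until the GC takes it.
public sealed class HybridCache<TKey, TValue> where TValue : class
{
    private readonly int capacity;
    private readonly LinkedList<KeyValuePair<TKey, TValue>> strong =
        new LinkedList<KeyValuePair<TKey, TValue>>();
    private readonly Dictionary<TKey, WeakReference> weak =
        new Dictionary<TKey, WeakReference>();

    public HybridCache(int capacity) { this.capacity = capacity; }

    public void Put(TKey key, TValue value)
    {
        strong.AddFirst(new KeyValuePair<TKey, TValue>(key, value));
        weak[key] = new WeakReference(value);
        if (strong.Count > capacity)
        {
            // Demote the oldest item: drop the strong reference and
            // let the weak one ride out however many GCs it survives.
            strong.RemoveLast();
        }
    }

    public TValue Get(TKey key)
    {
        WeakReference wr;
        if (weak.TryGetValue(key, out wr))
        {
            return wr.Target as TValue; // null if collected
        }
        return null; // caller regenerates on null (regeneration is cheap)
    }
}
```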
|
# ? Dec 29, 2011 00:47 |
|
Would it be beneficial for me to attempt to learn C# and ASP at the same time? With some Sharepoint integration thrown in?
|
# ? Dec 29, 2011 20:28 |
|
aBagorn posted:Would it be beneficial for me to attempt to learn C# and ASP at the same time? With some Sharepoint integration thrown in? I would recommend starting with C#, transitioning to ASP.NET (fairly quickly if you're comfortable with other languages). Get to know ASP.NET to a good degree before tackling SharePoint. While SharePoint is based on ASP.NET, it's much larger and more complex. It also imposes a certain rigor on development that web developers seem to chafe against. Well, it doesn't exactly impose that rigor, but there are a lot of ways you can really damage a SharePoint farm if you don't know what you're doing and stray away from best practices. If you want some insight into SharePoint development, start here for the developer's overview and in particular read Microsoft SharePoint Foundation as an ASP.NET Application and the sections in ASP.NET vs. SharePoint: How Development Differs.
|
# ? Dec 29, 2011 20:46 |
|
boo_radley posted:I would recommend starting with C#, transitioning to ASP.NET (fairly quickly if you're comfortable with other languages). Get to know ASP.NET to a good degree before tackling SharePoint. While SharePoint is based on ASP.NET, it's much larger and complex. It also imposes a certain rigor on development that web developers seem to chafe against. Well, it doesn't exactly impose that rigor, but there's a lot of ways that you can really damage a SharePoint farm if you don't know what you're doing and stray away from best practices. Thanks. This will be my first foray into developing, and I've put a little bit of time into C# already (a lot of basics at this point, still learning) and one of my bosses was talking with me about it and said that they would look into budgeting a new Jr Developer position in for me if I felt comfortable enough with the .NET Web side. Need to get out of help desk hell, you see.
|
# ? Dec 29, 2011 21:17 |
|
Zhentar posted:Still not ideal, but it comes closer than anything else I've seen. I may well make use of that. You could always use finalizers to continually resurrect cache items when memory pressure is low.
|
# ? Dec 30, 2011 02:39 |
|
Is GhostDoc Pro really worth it? I've been using the free edition and it's been working fine, is it worth shelling out?
|
# ? Dec 31, 2011 22:52 |
|
Factor Mystic posted:Is it possible to have a gradient run the length of an arc in WPF? I have seen this post which seems to indicate a goofy amount of code is required (so, par for WPF) but the example is a geometry path, not an arc, so I'm hoping to find some trick that will make it simple. Anyone have a thought on this, from last page?
|
# ? Dec 31, 2011 22:57 |
|
GrumpyDoctor posted:I'm trying to marshal some of these from C#: Quoting myself because I think I found the problem: in the native struct, unioned_rep has an offset of eight bytes, not four. Is the best practice here to force the alignment on the native struct or the managed one?
|
# ? Jan 1, 2012 00:03 |
|
Are you compiling for x64? It's probably putting the union at a word boundary; it should have a 4-byte offset on x86. If so, the fix is to use a named union instead of trying to use LayoutKind.Explicit to mimic an anonymous union: code:
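For illustration, a hypothetical struct showing the named-union layout (the real Foo wasn't posted; the 8-byte double stands in for whatever member forces the alignment): the overlapping fields live in their own explicit-layout struct, and the outer struct stays Sequential so the CLR computes the union's offset the same way the C compiler does.

```csharp
using System;
using System.Runtime.InteropServices;

// The union itself: every field at offset 0, as in C.
[StructLayout(LayoutKind.Explicit)]
public struct FooUnion
{
    [FieldOffset(0)] public int AsInt;
    [FieldOffset(0)] public double AsDouble; // 8-byte member forces 8-byte alignment
}

// The outer struct stays Sequential; the runtime pads Tag out so the
// union lands on its natural 8-byte boundary -- matching the native side.
[StructLayout(LayoutKind.Sequential)]
public struct Foo
{
    public int Tag;        // offset 0
    public FooUnion Value; // offset 8, not 4, because of the double
}
```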
|
# ? Jan 1, 2012 00:54 |
|
It's x86, and the problem is that it's a 4-byte offset, but the native structure has an 8-byte offset. I'm going to need to "manually" fiddle with the alignment on one of them, but I'm not sure which one is a better plan. e: here's MSDN on the structure alignment compiler switch: MSDN posted:When you specify this option, each structure member after the first is stored on either the size of the member type or n-byte boundaries (where n is 1, 2, 4, 8, or 16), whichever is smaller. raminasi fucked around with this message at 01:03 on Jan 1, 2012 |
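If the native side really was built with one of those /Zp packing switches, the managed struct can mirror it with the Pack field; a hypothetical example of the knob (fields invented, not the poster's actual struct):

```csharp
using System;
using System.Runtime.InteropServices;

// Pack = 4 mirrors /Zp4: each member aligns on min(member size, 4).
// Under the default packing, B would sit at offset 8; here it sits at 4.
[StructLayout(LayoutKind.Sequential, Pack = 4)]
public struct PackedFoo
{
    public int A;    // offset 0
    public double B; // offset 4 under Pack = 4
}
```

Keeping the packing declared on the managed mirror (rather than re-aligning the native header) has the advantage that the native ABI, which other native code may already depend on, stays untouched.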
# ? Jan 1, 2012 00:59 |
|
|
GrumpyDoctor posted:It's x86, and the problem is that it's a 4-byte offset, but the native structure has an 8-byte offset. I'm going to need to "manually" fiddle with the alignment on one of them, but I'm not sure which one is a better plan. When I'm doing anything with persistent or shared binary data structures, I'm always completely explicit about every single drat piece of code that reads or writes that data structure. I never use "sizeof" or the like. I always give it exact byte offsets (and specify whether it's big-endian or little-endian!). If it's a binary format that will persist or be transferred between modules, then it sure as heck needs to be carefully documented, and the code that reads it has to be paranoid to avoid danger from malicious or malformed input.
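A tiny sketch of that style: explicit offsets, an explicit endianness check, and a bounds check before touching anything. The record layout here is invented for the example; the point is the paranoia, not the format.

```csharp
using System;
using System.IO;

public static class RecordReader
{
    public static void Main()
    {
        // A documented little-endian record: int32 tag at offset 0,
        // float64 value at offset 8 (bytes 4-7 are padding).
        byte[] buffer = new byte[16];
        BitConverter.GetBytes(42).CopyTo(buffer, 0);
        BitConverter.GetBytes(3.5).CopyTo(buffer, 8);

        // Paranoia first: reject truncated input before reading anything.
        if (buffer.Length < 16)
            throw new InvalidDataException("record truncated");

        // The format is defined as little-endian, so refuse to silently
        // misread on a big-endian host rather than trust sizeof/layout.
        if (!BitConverter.IsLittleEndian)
            throw new NotSupportedException("byte-swap needed on this platform");

        int tag = BitConverter.ToInt32(buffer, 0);       // explicit offset 0
        double value = BitConverter.ToDouble(buffer, 8); // explicit offset 8

        Console.WriteLine(tag);   // 42
        Console.WriteLine(value); // 3.5
    }
}
```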
|
# ? Jan 1, 2012 01:08 |