  • Locked thread
Zhentar
Sep 28, 2003

Brilliant Master Genius

thehandtruck posted:

Error 1 Value of type '1-dimensional array of Hotel.Form1.Customer' cannot be converted to '1-dimensional array of String' because 'Hotel.Form1.Customer' is not derived from 'String'.

Well, you didn't give us the definitions we need, but the error message suggests them. I'm assuming PrintArrays looks something like this:
code:
Sub PrintArrays(Values() As String)
Now, the error message there is pretty clear; read through it carefully. You have an array of type Customer. You need an array of type String. The compiler does not know how to make an array of String from an array of Customer.

That means you either have to change PrintArrays so that it accepts an array of Customer, or you have to build an array of String yourself to send to PrintArrays.

Zhentar
Sep 28, 2003

Brilliant Master Genius
That is what he meant. Or, more simply and about as effectively, you can just call l.BeginUpdate before the loop and l.EndUpdate afterwards. If you add items one at a time without doing that, the listview will automatically do some redrawing after every single one, adding a substantial amount of overhead.
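For the curious, a minimal sketch of the batched pattern (assuming a ListView named l and a string array, which are hypothetical names):

```csharp
using System.Windows.Forms;

static void FillList(ListView l, string[] names)
{
    l.BeginUpdate();            // suspend per-item redraws
    try
    {
        foreach (var name in names)
            l.Items.Add(name);
    }
    finally
    {
        l.EndUpdate();          // one repaint for the whole batch
    }
}
```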

Zhentar
Sep 28, 2003

Brilliant Master Genius
Maybe it would work better if you did feetString = feetTextBox.Text;?

Zhentar
Sep 28, 2003

Brilliant Master Genius
I've got an application set up as a single instance, using WindowsFormsApplicationBase.IsSingleInstance, so that if other programs try to launch it with command line arguments, the instance that is already running can receive those arguments. I'm running into two problems with this: it takes nearly 200 ms for a new instance to start, send its parameters to the main instance, and close, so if a program runs it 20 or 30 times in a row, it takes a while to finish (even though my app doesn't need more than half a second of actual work). Worse, if a program does those same 20 or 30 runs in parallel instead of sequentially, some of the new instances can starve and crash ("This single-instance application could not connect to the original instance").

Is there a better way to do this?

Zhentar
Sep 28, 2003

Brilliant Master Genius

Ugg boots posted:

I don't know about passing command line arguments over from the new instance to an old one, but to just enforce single instance you can create a Mutex with the name like "MyApplication" and check its out parameter to check if you were the first one to create it. If you were, you're the first instance and you need to flag that object with GC.KeepAlive so it stays around until your application closes. If you open the mutex and you weren't the first to create it, you know there's another process floating around out there already.

Seemed like too much trouble to go through all that, so I just made it a no-instance app (i.e. Main doesn't do anything) and it still takes a bit over 100 ms per instance. I'm going to assume that means no .NET program will start up fast enough to satisfy me here, so I'll do something lightweight in C++ to handle this instead.

Zhentar
Sep 28, 2003

Brilliant Master Genius

Bellend Sebastian posted:

I've been working on writing a screen buffer, trying not to use Console.Clear() so I don't get that flickering you get on refresh. I tried making it write over all the screen tiles, but that worked out slower, unsurprisingly, and without a method for clearing, it starts getting really patchy, with remnants and artifacts kicking about after the character has moved.

If you keep an array of the previous frame, then you can compare it to the contents of the current frame, and only redraw/clear the tiles that have changed.
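A rough sketch of that diffing approach, assuming the screen is kept as char[,] buffers (all names made up):

```csharp
using System;

// Redraw only the cells that differ from the previously drawn frame.
static void DrawFrame(char[,] previous, char[,] current)
{
    for (int y = 0; y < current.GetLength(0); y++)
        for (int x = 0; x < current.GetLength(1); x++)
            if (previous[y, x] != current[y, x])
            {
                Console.SetCursorPosition(x, y);
                Console.Write(current[y, x]);   // overwriting also clears stale remnants
                previous[y, x] = current[y, x];
            }
}
```

Because unchanged cells are never touched, there's no full-screen clear and no flicker.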

Zhentar
Sep 28, 2003

Brilliant Master Genius
Debug the executable with Windbg.

Zhentar
Sep 28, 2003

Brilliant Master Genius

TJChap2840 posted:

I was more asking if there was a known simple way. My way is technically correct, but because of the white background in my images, the result isn't correct.

You look at each pixel, see if it's white, and if it is, set that pixel's alpha to zero so it becomes fully transparent.
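Something along these lines, using System.Drawing (GetPixel/SetPixel are slow and LockBits is the fast path, but this shows the idea):

```csharp
using System.Drawing;

// Turn pure white pixels fully transparent. Assumes a 32bpp ARGB bitmap.
static void WhiteToTransparent(Bitmap bmp)
{
    for (int y = 0; y < bmp.Height; y++)
        for (int x = 0; x < bmp.Width; x++)
        {
            Color c = bmp.GetPixel(x, y);
            if (c.R == 255 && c.G == 255 && c.B == 255)
                bmp.SetPixel(x, y, Color.FromArgb(0, c.R, c.G, c.B)); // alpha 0 = transparent
        }
}
```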

Zhentar
Sep 28, 2003

Brilliant Master Genius
From my testing in the past, the fastest way to get data into Excel is by putting tab delimited data onto the clipboard and pasting it in. Unfortunately, it's also a fragile method that can easily fail silently with large amounts of data.
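The clipboard trick is just this (a sketch; the paste into Excel itself would go through interop, e.g. Range.PasteSpecial, which isn't shown):

```csharp
using System.Text;
using System.Windows.Forms;

// Excel treats tabs as column breaks and newlines as row breaks on paste.
// Note: Clipboard.SetText requires an STA thread.
static void CopyForExcel(string[][] rows)
{
    var sb = new StringBuilder();
    foreach (var row in rows)
        sb.AppendLine(string.Join("\t", row));
    Clipboard.SetText(sb.ToString());
}
```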

Zhentar
Sep 28, 2003

Brilliant Master Genius

TJChap2840 posted:

Is there any way to make a WinForms application look consistent across all screen DPIs? My google-fu is coming up with a bunch of convoluted answers that only half work.

http://msdn.microsoft.com/en-us/library/ms229605.aspx

Zhentar
Sep 28, 2003

Brilliant Master Genius
Say I have a function prototype like so:
code:
public T CreateAndDoSomething<T>(Action<T> somethingDoer) where T : new()
Is it even possible for type inference to resolve T there?

Zhentar
Sep 28, 2003

Brilliant Master Genius

ljw1004 posted:

However, it has to infer T from the Action<T> argument. It will never infer from what you do with the return type.

Ah, yeah. I secretly meant "without an explicitly typed Action<T>".

I'm pretty sure the answer to this is no, but is there any way to make a generic constraint on a generic class, e.g. where T : List<>?

Zhentar
Sep 28, 2003

Brilliant Master Genius

Sedro posted:

Sure, but you'll need to at least have a type param for the item. .NET generics don't use type erasure. Here are a few things you can do, depending on what you want to accomplish. What do you want to accomplish?

That's about what I figured, thanks.

What I'm trying to accomplish is getting stronger typing with a collection class, while hopefully minimizing changes to my existing code and code duplication, and learning more about effectively using generics in the process.

Zhentar
Sep 28, 2003

Brilliant Master Genius

Zhentar posted:

What I'm trying to accomplish is...

After poking around at it a bit more, I can actually explain the main problem I'm having.

I've got this:
code:
public class BaseClass
public class SubClass : BaseClass

public Container<SubClass> SomeFunctionOrOther()
And I need to be able to do both of these:
code:
Container<BaseClass> base = SomeFunctionOrOther();
Container<SubClass> sub = SomeFunctionOrOther();
The output of SomeFunctionOrOther must be stored as an object that inherits from a particular concrete class, so I can't use an interface with covariance.

Making it public Container<T> SomeFunctionOrOther<T>() sucks, in part because it can't infer the type so I'd have to add in the right type to hundreds of lines of code, and in part because generic constraints won't let me specify that T must be castable from SubClass. I'm also concerned that the complexity of my actual code may mean that I end up with a long list of types for some functions.

Zhentar
Sep 28, 2003

Brilliant Master Genius

Sedro posted:

So a call to SomeFunctionOrOther<SubClass> can return a Container<SubClass>, Container<BaseClass> or Container<object>? That would be contravariance, and its possibility depends on what Container does.

I mean something like this:
code:
public class A
{
  public Container<BaseClass> base { get; set;}
}

public class B
{
  public Container<SubClass> sub { get; set; }
}

public Container<T> ContainerFullOfSubClass<T>()
{
  Container<T> retval = new Container<T>();
  retval.Add(new SubClass());
  return retval;
}

static void Main()
{
  A a = new A(); B b = new B();
  a.base = ContainerFullOfSubClass<BaseClass>();
  b.sub = ContainerFullOfSubClass<SubClass>();
}
But after thinking about it more, I'm starting to feel that it wouldn't actually be too much trouble to restructure things to allow using an interface and getting covariance.

Zhentar
Sep 28, 2003

Brilliant Master Genius

dwazegek posted:

And no amount of where constraints will get it to work, since the only way to guarantee that Container<T> will be able to take a SubClass is if SubClass (or a base class of SubClass) is T .

Yeah, that was my point. That is an actual functional requirement of my code anyway; I just can't express that with constraints.

Zhentar
Sep 28, 2003

Brilliant Master Genius
Seeking within a file vs. copying it to memory is probably not that big of a difference. The OS loads the file into memory and reads from there anyway, whether you tell it to or not.

Zhentar
Sep 28, 2003

Brilliant Master Genius

Dietrich posted:

I'm just hoping Roslyn finally lets us do ILmerge type stuff with WPF applications. I hate releasing programs which are 1 mb of .exe and 30 mb of .dll's.

Roslyn is for the opposite end of what you think it is. They aren't giving access to let you mess with the compiler output, they're helping you mess with the input (i.e. your code). The biggest feature of Roslyn is that it gives you access to the Abstract Syntax Tree the compiler generates, which you can easily use for things like static analysis or refactoring.

The Roslyn CTP looks really cool; I'm going to have to try playing with it soon. Unfortunately, it's got the usual CTP constraints, and there's a pretty big list of unimplemented features (if it doesn't gracefully ignore them, that will be problematic for most of the code I'd be interested in trying it on). Are there likely to be any updates to the CTP with further language feature support? And is there some legalese somewhere that defines what exactly re-distribution & production use mean?

As a side note, ASTs combined with Linq queries makes for an amazing combination.

Zhentar
Sep 28, 2003

Brilliant Master Genius

Jethro posted:

It's not what I'd naively expect, but I read Fabulous Adventures in Coding often enough to know that this sort of thing can often act in ways contrary to a naive understanding.

If you read it just a little more often, you wouldn't be surprised at all.

Zhentar
Sep 28, 2003

Brilliant Master Genius

ljw1004 posted:

The C#/VB team had to wrestle with how to represent the parse-tree of source code in the editor window to plugins. The problem is that the parse-tree can be modified by the user, and the plug-ins run in different threads or processes. The solution we used was to give each plugin an immutable snapshot of the tree at a given moment in time. (Obviously, when the plugin wants to call an API to update the source code, it needs to check whether its snapshot is still current.)

Do you run into memory usage problems with that? I'm sure your trees are usually much smaller than mine (I've been working to get 15 million nodes to use less than 1GB), but keeping around more than one copy could inflate that pretty quickly.


Speaking of which, are there any List<T> alternatives (aside from just a simple array, of course) with lower memory overhead? 32 bytes per List<> can really add up...

Zhentar
Sep 28, 2003

Brilliant Master Genius

Mr. Crow posted:

Basically I have child : parent<T> and I want T.

Can you modify the parent class? You could just add a method to Parent<T> that returns typeof(T).
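That is, something like this (hypothetical names):

```csharp
using System;

public class Parent<T>
{
    // Inside the generic class, T is bound to a concrete type,
    // so any subclass inherits a way to report it.
    public Type GetItemType()
    {
        return typeof(T);
    }
}

public class Child : Parent<string> { }

// new Child().GetItemType() returns typeof(string)
```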

Zhentar
Sep 28, 2003

Brilliant Master Genius

genki posted:

I know this is possible but unfortunately I can't look up/test the specifics atm.

It's not possible to cast parent to Parent<ICommon> unless T is ICommon. You can only use covariance with interfaces, and even then you shouldn't be able to make something with an Add function covariant.

I don't know if there's a way to cast it to Parent<T>, but it definitely sounds to me like the kind of question you shouldn't need to ask in the first place.

Zhentar
Sep 28, 2003

Brilliant Master Genius
From my experience, VS will run out of memory and crash on VSPs 1/5th that size. But even if it does manage to process that much successfully and maintain reasonable performance, that's probably too much data to reasonably interpret and understand.

Zhentar
Sep 28, 2003

Brilliant Master Genius

wellwhoopdedooo posted:

The other answers you got are fairly lovely, since they don't explain why you should or shouldn't override .GetHashCode().

You still didn't explain why! Though you at least demonstrated it.

Even if your own code never calls .GetHashCode(), infrastructure (such as the Dictionary class) does call it, and depends upon hash code equality being equivalent to object equality.
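A quick illustration of what goes wrong (made-up type; Dictionary hashes the key first to pick a bucket, then checks equality within it):

```csharp
using System;
using System.Collections.Generic;

class Point
{
    public int X, Y;

    public override bool Equals(object obj)
    {
        var p = obj as Point;
        return p != null && p.X == X && p.Y == Y;
    }

    // Without this override, two equal Points get different default hash
    // codes, land in different buckets, and Dictionary lookups silently miss.
    public override int GetHashCode()
    {
        return X * 31 + Y;
    }
}

// var d = new Dictionary<Point, string>();
// d[new Point { X = 1, Y = 2 }] = "a";
// d.ContainsKey(new Point { X = 1, Y = 2 })  -- true only with both overrides
```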

Zhentar
Sep 28, 2003

Brilliant Master Genius
Is there a good way to detect memory pressure conditions? Part of my application can end up creating a very large number of objects that can be re-generated with pretty low overhead. I'd like to find some middle ground between crashing when I could easily free a gigabyte of memory, and always aggressively dumping things needlessly to avoid crashing in rare cases.

Zhentar
Sep 28, 2003

Brilliant Master Genius
I hadn't really considered weak references... I've only used them previously for event handlers, which wasn't much fun, but I'd forgotten that was because of the event handler part, not the weak part. Still, I don't think they're the right choice. From the documentation, and some discussion, it doesn't sound like a WeakReference will cause an object to survive a generation 0 garbage collection, in which case my cache would be getting flushed every few milliseconds. I may still give it a try though; my initialization is cheap enough that it still could be good enough.

Currently, I'm thinking the best approach is to go with a cache collection that will use System.GC.WaitForFullGCApproach and then flush itself. Oh, except that won't work if concurrent garbage collection is available. Crap.

Zhentar
Sep 28, 2003

Brilliant Master Genius

Sedro posted:

I did some poking around for you. System.Web.Caching.Cache internally uses GlobalMemoryStatusEx and some magic numbers to determine the memory pressure. It might also be using timers. Hope you like getting your hands dirty.

Yeah, screw that, it's not worth that much trouble.

Hmm, I could go with a hybrid of the weak reference and magic number approach - use a collection with a small, safe maximum capacity using strong references, evicting things to a collection with weak references. As long as they survive long enough to get promoted to at least generation 1, the weak objects should be reasonably long lived, and if they don't, I'm still not any worse off than just using the magic number.
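Roughly what I have in mind, as a sketch (all names hypothetical):

```csharp
using System;
using System.Collections.Generic;

// Two-tier cache: a small bounded strong-reference tier keeps recent items
// alive; everything is also reachable through weak references, so evicted
// items survive only until the GC wants the memory back.
class HybridCache<TKey, TValue> where TValue : class
{
    private readonly int capacity;
    private readonly Queue<TValue> strong = new Queue<TValue>();
    private readonly Dictionary<TKey, WeakReference> weak =
        new Dictionary<TKey, WeakReference>();

    public HybridCache(int capacity) { this.capacity = capacity; }

    public void Add(TKey key, TValue value)
    {
        weak[key] = new WeakReference(value);
        strong.Enqueue(value);
        if (strong.Count > capacity)
            strong.Dequeue();   // demoted to weak-only
    }

    public TValue Get(TKey key)
    {
        WeakReference wr;
        if (weak.TryGetValue(key, out wr))
            return wr.Target as TValue;   // null if already collected
        return null;
    }
}
```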


Sedro posted:

Edit: You can also use MemoryFailPoint to better predict and handle an out-of-memory situation.

Still not ideal, but it comes closer than anything else I've seen. I may well make use of that.

Zhentar
Sep 28, 2003

Brilliant Master Genius
.NET 4.5 is the same in that it targets the same MSIL. But it isn't purely compiler stunts and syntactic sugar (and neither was .NET 3/3.5). There are API additions and behavior changes.

What makes .NET 4.5 "special" is that it's going to pretend to be .NET 4.0, right down to the build number. This is supposed to be totally okay because they're testing hundreds of applications for compatibility, which will ensure that nothing is broken by behavior changes (excepting of course the breaking changes they made on purpose), and hey, it's not like you were going to check if .NET 4.5 was installed before running your .NET 4.5 dependent application anyway, right?


As far as why they're doing this, that's easy. They're shaving precious megabytes off the Windows 8 install footprint to try to cram it onto devices with only 16GB of flash.

Zhentar
Sep 28, 2003

Brilliant Master Genius

boo_radley posted:

This is pretty neat (from Hidden Features of C#):

That people would consider many of those features "hidden" saddens me. yield? Seriously?

Zhentar
Sep 28, 2003

Brilliant Master Genius
I wouldn't really expect it to be that slow for such a small data set, but you're repeatedly searching over the entire data set. It would be a lot faster to do a first pass to pull them out into a dictionary or similar by parent ID.
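That is, trade one pass and some memory for all the repeated scans (Item and ParentId are stand-in names):

```csharp
using System.Collections.Generic;
using System.Linq;

class Item
{
    public int Id;
    public int ParentId;
}

// One O(n) pass to bucket rows by parent, instead of re-searching
// the whole list for every node's children.
static Dictionary<int, List<Item>> ByParent(IEnumerable<Item> items)
{
    return items.GroupBy(i => i.ParentId)
                .ToDictionary(g => g.Key, g => g.ToList());
}
```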

Zhentar
Sep 28, 2003

Brilliant Master Genius

Dromio posted:

Now I've pointed the routine at our production data set (20000 rows) and it's still way too slow:

code:
[Debug] '' 2/1/2012 10:12:18 AM Start
[Debug] '' 2/1/2012 10:12:21 AM Fetched 19550 items for tree
[Debug] '' 2/1/2012 10:12:21 AM Building dictionary
[Debug] '' 2/1/2012 10:17:58 AM Done building dictionary
...
[Debug] '' 2/1/2012 10:28:38 AM Done building hierarchy

What the hell are you running on, a 286? For a mere 20,000 rows, I'd expect that dictionary code to be running in a fraction of a second... your best option is to profile, if you can. I don't think the real problem is contained in the code you've posted.

Zhentar
Sep 28, 2003

Brilliant Master Genius
You can also just add a column for the salt.

For other recommendations, it's not a bad idea to bump the hash up to SHA-256, and then hash the hash ten thousand times in a row.

Zhentar
Sep 28, 2003

Brilliant Master Genius

Dietrich posted:

Using an individual, pseudo-random and obvious salt for users means that a hacker will need to compile their own rainbow table in order to crack any given user's password.

If you're working with a unique salt, there's no point to a rainbow table. You just brute force it straight up.

Dietrich posted:

Using an individual, pseudo-random and hidden salt for users means that a hacker will need to compile several rainbow tables in order to crack any given user's password. The number of rainbow tables required could range anywhere from one to five hundred depending on the complexity of the salt generator. You don't have the slightest clue how many it will take, and you'll have to check each table for a hash match and try to use it on the target system to validate it. Every failed attempt can be used to potentially identify the fact that you are trying to hack the system.

I don't need to touch the target system to figure out if I've figured out the right salt. I can just take a list of the top 100 (poo poo, even just the top 10) most common passwords and test it against every user in the database. If I don't get a significant number of hash matches, I haven't figured out the salt yet.

Dietrich posted:

You're a hacker. You've got several users tables to try to crack. They've each got a govpalin@yahoo.com user on them. One has a salt listed. The other does not. Which one do you crack?

A)I probably would've stopped after I got the first users table...
and
B)Both. It's not like this poo poo is hard, why not?

Zhentar
Sep 28, 2003

Brilliant Master Genius

The Gripper posted:

At least that way if the database is compromised an attacker would need to determine the salt from known data, which would be trivial if we had used a single immutable field (e.g. attacker creates an account, dumps the database to find his hash, tries combinations of id+password, password+id, timestamp+password, password+timestamp until he finds which immutable field is the salt).

My god, there might be dozens or even hundreds of possible combinations! And I can only test 50 million hashes/second on this machine. Christ, I hope I don't get too bored waiting for the answer.

Zhentar
Sep 28, 2003

Brilliant Master Genius

Dietrich posted:

Brute forcing is pretty trivial to prevent with a e-mail to their registered account they must get to unlock their account after a fixed number of failed login attempts. If they've already compromised the target's email account then they can just reset their password anyway.

A "Rainbow Table" is a pre-computed database that allows efficient look-up of the hash input if it contains a given hash output. The net process of building the database and then looking up a single value in it is not efficient; it is the "pre-computed" part that makes it worthwhile. A unique salt prevents the pre-computed part. If there is a unique salt, then you just use the good, old fashioned trial and error method.


Dietrich posted:

The point isn't just to make it harder, the point is to make it more time consuming as well. The longer it takes for them to crack it, the longer you have as a responsible admin to discover the security breach and notify your users that their passwords may have been compromised.

The point is that you haven't made it significantly more time consuming. You've made yourself feel better by trying to pull some little trick in the naive hope that it will take a hacker longer than 15 minutes to figure out what you've done.

Meanwhile...
code:
public static byte[] TimeConsumingHashFunction(byte[] input)
{
    HashAlgorithm hash = HashAlgorithm.Create("SHA256");
    byte[] output = hash.ComputeHash(input);
    // Re-hash the digest so that testing one candidate password
    // costs 50,000 SHA-256 operations instead of one.
    for (int x = 1; x < 50000; x++)
    {
        output = hash.ComputeHash(output);
    }
    return output;
}
I've just made figuring out each password take 50,000 times longer. No tricks, no trying to sneak away extra secrets and then hoping the hackers won't be clever. It will take just as long to calculate each hash whether or not they have my source code. Unless I made some stupid mistake in there, which is usually what happens when you try to write your own security code.

It still wouldn't be enough to protect against people using a password from a common-passwords dictionary, but there's not really anything you can do about that (well, maybe you could load up your own common password dictionary and reject any password in it...)

Zhentar
Sep 28, 2003

Brilliant Master Genius
My research concluded the best option was to print an RTF document to a PDF "printer".

Zhentar
Sep 28, 2003

Brilliant Master Genius

GrumpyDoctor posted:

What keywords should I be searching for?

Throbber.

Zhentar
Sep 28, 2003

Brilliant Master Genius
Option #4: Use a database.

Zhentar
Sep 28, 2003

Brilliant Master Genius

This Post Sucks posted:

With the error of: "CS0103: The name 'Request' does not exist in the current context". From everything I've seen, Request should be in System.Web, right?

a) It's a property of System.Web.HttpContext
b) It's not static. You have to access it through an actual HttpContext instance.

This Post Sucks posted:

code:
    someVariable =  HttpContext.Current.Request.Cookies["myCookies"]["data"];
I'm getting an error of "System.NullReferenceException: Object reference not set to an instance of an object." which seems to me that the "[data]" part of the cookie hasn't been initialized yet, thus just doesn't exist. Is there any way in .NET to find this out so I can then set it?

Even if accessing an uninitialized item weren't allowed (it is), it wouldn't throw a NullReferenceException. As MSDN says, it is thrown when you dereference a null object. Presumably Cookies["myCookies"] is returning null.
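So check the cookie itself before indexing into it, something like:

```csharp
using System.Web;

// Cookies["myCookies"] returns null when the cookie doesn't exist;
// guard that before touching its sub-values.
var cookie = HttpContext.Current.Request.Cookies["myCookies"];
string data = (cookie != null) ? cookie["data"] : null;
if (data == null)
{
    // cookie or "data" value missing; create/initialize it here
}
```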

Zhentar
Sep 28, 2003

Brilliant Master Genius

Mr.Hotkeys posted:

Other than readability is there any advantage to this?

Compile-time type checking. If people isn't an IEnumerable&lt;person&gt;, foreach (person x in people) will throw a cast exception at run time, while .ForEach(action) will raise a compile-time error about overload resolution (assuming, of course, there isn't an action overload that matches what people does contain).
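To make that concrete (a contrived example):

```csharp
using System;
using System.Collections;

class Demo
{
    static void Main()
    {
        // An untyped collection with mixed contents.
        ArrayList people = new ArrayList { "alice", 42 };

        // Compiles fine; throws InvalidCastException when it reaches 42.
        foreach (string name in people)
            Console.WriteLine(name);

        // Won't even compile: ArrayList has no ForEach(Action<string>).
        // people.ForEach((string name) => Console.WriteLine(name));
    }
}
```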
