Inverness
Feb 4, 2009

Fully configurable personal assistant.

PnP Bios posted:

I did a writeup on how reflection can be useful for building a game editor for your .NET game. Let me know what you think?

http://unicorn21.tumblr.com/post/1641105473/a-new-detailed-writeup-on-using-reflection-in-your
That is a good guide. :) It is simple, to the point, and easy to understand.

The engine I'm working on is a hybrid of Stackless Python and C++ though, so I don't have that problem since I'm using wxPython to create my editor (which is awesome). But that information will be useful if I ever decide to work on a C# project.

By the way, is anyone here familiar with RakNet? I've just checked it out and I'm really impressed by the features it offers and how easy it is to set everything up and integrate. It's exactly what I wanted, since I didn't want to have to use raw TCP or roll my own solution for reliable variable replication.

Inverness
Feb 4, 2009

Fully configurable personal assistant.

Otto Skorzeny posted:

Source is an abomination for many, many reasons
Oh, that sounds interesting. Could you elaborate a bit more on that?

Inverness
Feb 4, 2009

Fully configurable personal assistant.

Vino posted:

Source was built out of Half-Life which was built out of Quake. Therefore it carries all of the old engine structure paradigms around with it. For example it's the only AAA engine I can think of that still uses BSP as its main scene management. Its tools are poorly supported and under-featured (Hammer only recently included a crappy lighting preview), and Valve provides little or no documentation other than the wiki, which is not really enough.

I've said it many times but here it is again: Source works great for Valve and what Valve does but nobody else should be using it. The only people who should use Source in my opinion are Valve and TF2/L4D modders.
I've been wondering about the Source engine architecture for quite a while. There are the separate client and server modules, and separate but similar classes on each side. It's quite different from how the Unreal engine works, which is something I understand a bit better. Basically, I've been wanting to know why the client and server code is separated like that and how it behaves in a single-player game compared to multiplayer.

Inverness
Feb 4, 2009

Fully configurable personal assistant.

haveblue posted:

As I understand it Unreal works at the level of individual variables. Object instances are identical on the server and all clients (with the server authoritative) and certain members are tagged as replicated, which causes them to be asynchronously kept up to date with the values of their remote equivalents. You can request different priorities, interpolation types, etc for each of them. The core network layer marshals all the pending updates into bundles, invisibly.
This is correct. I really like how Unreal enables you to create conditions for replicating certain variables rather than just doing a blanket approach. After doing some looking, I was able to find some C++ code from UT2k4's Onslaught package (which isn't leaked) that demonstrates how they implement this in their code. Source: http://koders.com/cpp/fid212DA5370827D1C04D75E7B668BCD4FF3BACC209.aspx

Just from looking at that I was able to figure out how it was working and create almost identical functionality for my own game, and it took much less time than I thought. Even better, I was able to easily integrate it with RakNet's Replica Manager plugin. At first I wondered if I was maybe overdoing it a bit, but making it has been an interesting experience, and the code is open source if anyone wants to take a look or use it themselves.

Inverness
Feb 4, 2009

Fully configurable personal assistant.

Vino posted:

That Unreal code didn't look nearly as legible. What the hell is NEQ and GetOptimizedReqList() ?
NEQ means not-equal; it's a function used to compare the current value with the value most recently sent over the network.

GetOptimizedRepList() ("get optimized replication list") is the function called to figure out which variables need to be replicated, based on a set of conditions and whether or not their values have changed. The 'Recent' argument is a pointer to a copy of the object, which is cast so that the current value can be compared with the recently sent value. 'Ptr' is a pointer to an array used to store indices. If a variable needs to be replicated, its unique index is added to the array and the pointer position is advanced by one. DOREP is the macro that does all of that.

If you're familiar with the replication statements in UnrealScript, GetOptimizedRepList() is the C++ version of that. It's called optimized because it is much faster than the UnrealScript equivalent.

Overall I think it is a really nice way to do things. It's very flexible.

The code you posted from Source doesn't look like it has the same purpose as that.
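
For anyone who doesn't want to dig through the Unreal source, here's a rough C# sketch of the same idea; none of these names are Unreal's (ActorState, BuildDirtyList, and so on are made up), it just shows the pattern of comparing current values against the copy that was last sent and queuing only the ones that changed and pass their condition:
code:
public struct ActorState { public int Health; public int Ammo; }

public static class Replication
{
    // Compare current values against the copy last sent over the network (the "NEQ"
    // step) and record the index of each variable that changed and whose condition holds.
    public static int BuildDirtyList(ActorState current, ActorState lastSent,
                                     bool isOwner, int[] dirtyIndices)
    {
        int count = 0;

        if (current.Health != lastSent.Health)
            dirtyIndices[count++] = 0;      // index 0 = Health, replicated to everyone

        if (isOwner && current.Ammo != lastSent.Ammo)
            dirtyIndices[count++] = 1;      // index 1 = Ammo, only sent to the owner

        return count;                       // the caller serializes these indices and values
    }
}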

Inverness fucked around with this message at 17:27 on Dec 7, 2010

Inverness
Feb 4, 2009

Fully configurable personal assistant.

horse mans posted:

I'm starting to tackle a refactor towards composition and component systems in my engine (2D sprite-based), and I'm curious about how other people are doing this.

Before, I used inheritance heavily, so a Character object inherited Movable, Collideable, Updateable, and then mingled all these things together in the subclass to create the specific behavior for that thing. It's a nightmare that I'm trying to untangle now, but I'm not sure how to handle interaction between components and what components I should have.

I now have a MapObject component which provides behavior related to geometry (similar to the Translation component in Unity), a Movable component which applies movement vectors, checks against velocity, etc; a Geometry component which is meant to be able to tell when objects may collide with each other or the map. So if my character moves, I have to speak to each component in turn before I can finally perform the move. I need to make sure that the move won't collide with anything, I need to get the existing position, I need to use the Movable behavior to update the position based on velocity, and if I were going to go even further, the max velocity of the character might be based on its RPG attributes, so the Movable component would need to find that out.

I don't want to make components dependent on each other since that just undoes all the work of separating them. I also want the component behavior to be unit testable, which, admittedly I could do with simpler implementations of its dependent components. I've looked into message passing, but there still would need to be a basic tree of component dependencies:

  • Character sends message to move
  • Message goes to MapObject component to get current position
  • MapObject component sends message and position to Geometry component
  • Geometry component sees if Character will collide with world, if so, stops propagating the message, if not, sends it on to Movable
  • Movable component checks RPG Attribute to see current max velocity, compares against current velocity
  • Movable component applies increase in velocity against current position, sends new position to MapObject
  • MapObject updates current position with new position, sends message to rendering component
  • Rendering component updates spritesheet, notifies Camera to redraw
  • Eventually someone should let the Character know that its message was handled successfully or not.

So each fundamental action would have to have a notion of the chain of permissions and messages required to accomplish it, which seems like a ton of overhead.
I've also used a component system for my own project, but I found that I didn't like the typical ideas people have about them. Purely message-based components that treat objects as black boxes, where the components themselves only contain data, didn't seem like they would perform very well. Your list sounds like exactly the kind of situation I was trying to avoid.

I'm using C# and went with a simple object oriented approach:

  • Actors are the basic level objects that are composed with components.
  • When a component is added to an actor, it binds to all components of that actor that it depends on. For example, a component that moves an actor to another level when touched depends on a collider component of any type. It then hooks up to the C# events provided by that component and listens for the touch event. These bindings can be optional if necessary. (There's a rough sketch of this binding after the list.)
  • Because components use other components within the same actor directly, I decided to make it impossible to remove components from actors. This simplifies the code since you can guarantee that components attached to the same actor will always be valid for the lifetime of that component. This also simplifies networking since the order of a component never changes within its actor, and therefore neither does the order of its variables. Components are not replicated individually.
  • Each game level contains two additional "scenes" consisting of colliders and renderers. Those are basically lists kept up to date by the collision and rendering components, so that every actor doesn't need to be checked for each pass.
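
Something like this is the shape of the binding I mean; the names here (Actor, ColliderComponent, LevelLinkComponent, Touched) are placeholders standing in for my engine's types, not code you can lift directly:
code:
// A component resolves its dependency from the owning actor once, when it is
// attached, then subscribes to that component's events.
public class LevelLinkComponent : Component
{
    private ColliderComponent _collider;

    public string DestinationLevel { get; set; }

    protected override void OnAttached(Actor owner)
    {
        // Bind to any collider on the same actor; components are never removed,
        // so this reference stays valid for the rest of the actor's lifetime.
        _collider = owner.Components.Get<ColliderComponent>();
        _collider.Touched += OnTouched;
    }

    private void OnTouched(Actor other)
    {
        // Send whoever touched us to the destination level.
        other.Level.TransferActor(other, DestinationLevel);
    }
}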

Basic movement is done by setting the velocity property of a movement component; a sketch of the move itself follows the list below.
  • During every level update, the movement delta is calculated.
  • Level.MoveActor() is called with the current component's actor and movement delta. This method checks all colliders attached to that actor against the others in the level, and either applies the delta to the actor's position or rejects the move. It also dispatches any collision events.
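
As a sketch (again with made-up names like FindFirstOverlap), Level.MoveActor() looks roughly like this:
code:
// Move an actor by a delta if none of its colliders would overlap another
// collider in the level; dispatch collision events either way.
public bool MoveActor(Actor actor, Vector2 delta)
{
    Vector2 target = actor.Position + delta;

    foreach (ColliderComponent collider in actor.Components.GetAll<ColliderComponent>())
    {
        ColliderComponent hit = FindFirstOverlap(collider, target);
        if (hit != null)
        {
            collider.RaiseCollision(hit);   // collision events for both sides
            return false;                   // blocked: position is left unchanged
        }
    }

    actor.Position = target;
    return true;
}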

The movement never actually interacts with any rendering code. That is handled by a separate system which finds all visible renderer components and provides them with an interface that they can use to draw sprites. A sprite component would only draw one, while an animation would draw several.

Here is the aforementioned level link behavior: http://pastebin.com/xWmA5a5t

I also construct my actors by declaring their components and properties in archetypes. These are merged with whatever is specified at spawn time either by code or by an individual object in the level editor.

code:
"ActorArchetype Fireball": {
    "Components": "Movement, Sprite, Box",
    "MovementIsLimited": true,
    "Light": true,
    "Color": "192, 64, 64, 96",
    "CollideActors": true,
    "CollideTiles": true,
    "BlockActors": false,
    "Texture": "Textures/Light02",
    "RenderSize": "4, 4",
    "ColliderSize": "1.5, 1.5"
}
Hopefully this is useful to you in some way. I just ended up not liking the pure blackbox component model. And I never saw any problem with having some components depend on others to work, since it still allows me to compose them as needed.

Inverness
Feb 4, 2009

Fully configurable personal assistant.

Sagacity posted:

Sure, that would be inefficient. But you can just cache that information. This is where messaging would be appropriate (i.e. "a new entity has arrived, it has these components: x, y, z").
This. In the case of C# you can just have events that are invoked when components or actors are created or destroyed. Interested systems will handle these and do what they wish.
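
A minimal sketch of that caching pattern (the Level events and SpriteComponent type here are made up for illustration):
code:
using System.Collections.Generic;

// A system caches the components it cares about by listening to level-wide
// add/remove events instead of scanning every actor each frame.
public class RenderSystem
{
    private readonly List<SpriteComponent> _visible = new List<SpriteComponent>();

    public RenderSystem(Level level)
    {
        level.ActorAdded += OnActorAdded;
        level.ActorRemoved += OnActorRemoved;
    }

    private void OnActorAdded(Actor actor)
    {
        var sprite = actor.Components.Get<SpriteComponent>();
        if (sprite != null)
            _visible.Add(sprite);
    }

    private void OnActorRemoved(Actor actor)
    {
        var sprite = actor.Components.Get<SpriteComponent>();
        if (sprite != null)
            _visible.Remove(sprite);
    }
}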

Efficiency is much less of a problem if the action is infrequent. Collision checking and rendering are both frequent, so in my case there is a list of both types of components for each map. If I want to do something different, like damaging all actors within an area that intersected with my sword swing, then I would check each of the actors in that area for health components:
code:
foreach (Actor target in Level.GetActorsInBounds(GetWeaponDamageBounds(weapon)))
{
    if (target == Actor)
        continue; // don't hit yourself
    var health = target.Components.Get<HealthBehavior>();
    if (health != null)
        health.Damage(damage, Actor, DamageType.Physical);
}
Performance isn't a problem here.

Just remember that premature optimization is the enemy. :devil:

Inverness fucked around with this message at 23:46 on Jun 19, 2014

Inverness
Feb 4, 2009

Fully configurable personal assistant.

The King of Swag posted:

I'd like to know how you feel about this hybrid style, and I'd also like to hear what others think of the different proposed styles.
Regarding the unit types, it seems like it would be best to take the approach that C# does with types like TimeSpan. The actual value is stored as high-resolution ticks. Conversions to things like milliseconds, seconds, or days are done on access.

Rather than multiplying units by a floating-point number, have static methods to create them.

code:
Volume gallonsLeft = Volume.FromGallons(1f);	// 1 gallon
gallonsLeft += Volume.FromGallons(1f);		// 2 gallons
gallonsLeft += Volume.FromOunces(64f);		// 2.5 gallons
gallonsLeft -= Volume.FromMilliliters(500f);	// 2.37 gallons
float dramsLeft = gallonsLeft.Drams;		// 2.37 gallons in drams (2424.74)

MethodThatExpectsFloat(gallonsLeft.Gallons);	// Gets 2.37
MethodThatExpectsFloat(dramsLeft);			// Gets 2424.74;
This keeps it consistent with how TimeSpan works. Don't add any implicit conversions, since that would make the meaning of your numbers unclear. Multiplication or division by doubles or floats seems like it would be fine, but not addition or subtraction, since the meaning of the number being added or subtracted would not be clear.

code:
Volume g = Volume.FromGallons(2f);	// 2 gallons
g *= 2f;				// 4 gallons
g += Volume.FromGallons(1f);		// 5 gallons
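
To make the factory-method idea concrete, here is a guess at what such a type could look like internally; this is just a sketch that stores liters and converts on access, not anyone's actual library:
code:
using System;

// A unit type in the style of TimeSpan: static factory methods, conversion
// properties, and arithmetic only where the meaning is unambiguous.
public struct Volume : IEquatable<Volume>
{
    private const double LitersPerGallon = 3.785411784;
    private const double LitersPerFluidOunce = 0.0295735295625;

    private readonly double _liters;

    private Volume(double liters) { _liters = liters; }

    public static Volume FromGallons(double gallons) { return new Volume(gallons * LitersPerGallon); }
    public static Volume FromOunces(double ounces) { return new Volume(ounces * LitersPerFluidOunce); }
    public static Volume FromMilliliters(double ml) { return new Volume(ml / 1000.0); }

    public double Gallons { get { return _liters / LitersPerGallon; } }
    public double Drams { get { return _liters / (LitersPerFluidOunce / 8.0); } }

    // Addition and subtraction only between Volumes; scaling by a plain number is fine.
    public static Volume operator +(Volume a, Volume b) { return new Volume(a._liters + b._liters); }
    public static Volume operator -(Volume a, Volume b) { return new Volume(a._liters - b._liters); }
    public static Volume operator *(Volume v, double scale) { return new Volume(v._liters * scale); }

    public bool Equals(Volume other) { return _liters == other._liters; }
}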

Inverness fucked around with this message at 03:24 on Jul 7, 2014

Inverness
Feb 4, 2009

Fully configurable personal assistant.

OneEightHundred posted:

Keep in mind that part of why TimeSpan works the way it does is because its underlying type is essentially favored by the operating system, and its smallest representation divides evenly into every larger representation, so you never have to worry about getting subtle rounding errors from feeding a value to it and then reading it back out the way you would with most units of measure.

This makes a huge difference because if you're expecting something like this to work...
code:
Mass grainMass = new Mass(7000.0f, UnitOfMass.Grain);
Mass poundMass = new Mass(1.0f, UnitOfMass.Pound);

if (grainMass == poundMass) Log("We're equal!"); else Log("We're different!");
... then you're going to find out that equality is rare when you're converting values both ways with floating point scale factors.


Overall, you should probably adopt preferred units of measure anyway because they'll simplify conversion a lot and make it easy to make assumptions about what raw numbers mean.
It was mentioned previously that everything was defined in terms of SI units anyway. Given that, it would make sense to base the internal value on an SI unit, with properties or methods doing conversions to the desired unit.

In this case, if mass is stored as milligrams internally, then 1 pound would get you 453592.37 mg and 7000 grains... 453592.37 mg (64.79891 mg times 7000). So they would be exactly equal there.

Inverness
Feb 4, 2009

Fully configurable personal assistant.

Jewel posted:

That's why you're supposed to always compare with a small error margin with floats? Even resharper gives you a warning if you just use "x == y" and offers to autoreplace it with "Math.Abs(x - y) <= TOLERANCE" (where TOLERANCE is a custom value like 0.001)

Edit: Yeah, as above said, that was my reasoning too. It's for a game; it's not for getting down to the precise thousandth of a grain. If it was for scientific purposes you'd use either fixed point or some custom longer float/fixed storage type.
It seems like the best way to handle this would be to still have the default Equals() and equality operator use an exact comparison, but then have an overloaded Equals() method that allows the user to specify a tolerance for the comparison.

a.Equals(b, 0.001) as the manual way to do it, along with a.NearlyEquals(b), which uses a common tolerance.
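
The shape I'm picturing, assuming a Length backed by meters (the field name and tolerance value are just examples):
code:
using System;

public struct Length
{
    private const double DefaultTolerance = 1e-6;   // example shared tolerance

    private readonly double _meters;

    public Length(double meters) { _meters = meters; }

    // Default equality stays exact.
    public bool Equals(Length other) { return _meters == other._meters; }

    // Manual version: the caller states the tolerance at the call site.
    public bool Equals(Length other, double tolerance)
    {
        return Math.Abs(_meters - other._meters) <= tolerance;
    }

    // Convenience version using the single documented tolerance.
    public bool NearlyEquals(Length other) { return Equals(other, DefaultTolerance); }
}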

The King of Swag posted:

The equality comparison for all types must already be overloaded to comply with IEquatable<T> (all types implement IEquatable<T>, IComparable<T>; they'd be almost useless with keyed or sorted containers without those), and since I'm using Unity, I do the comparison between values with Mathf.Approximately. So far it seems to work great. A standardized unit is already stored internally, and converted to the format needed on request.

code:
Volume aGallon = Volume.Gallons(1f);	// Internally stores a value of 3.78541 L
Which makes me want to note that I've discussed this with others (one of our resident programmers in TFR being particularly helpful), and the consensus is that implicit conversion to float is fine, as long as there's no magic number +/- allowed, and any invalid arithmetic throws an exception. But considering that so many here are so averse to the implicit conversion, I'll drop it, even though in practice, I've found that the lack of it makes my tests more verbose for little gain. Which is to say, the pretty much finalized design is:

code:
Length boxWidth = Length.Feet(2f);
boxWidth += 1f;					// Illegal; no magic numbers.
boxWidth += 1f * Length.Foot;			// No implicit conversion to float, makes this illegal. This still just looks strange to me anyway.
boxWidth += Length.Feet(1f);			// Legal; number is an explicit type.

boxWidth *= 2f;					// Legal, because multiplication of lengths by a scalar is a valid operation.
Area boxArea = boxWidth * Length.Inches(30f);		
Volume boxVolume = boxArea * Length.Inches(6f);
Length boxHeight = boxVolume / Area.SquareCentimeters(13935.456);

MethodThatExpectsFloat(boxHeight.InInches);	// 6 inches
Finally, for those curious about the total variety of supported units: Initial Supported Units

Before anyone goes about claiming that I went overboard on the total number of supported units; remember that when this is done, I'm planning on releasing this to the asset store, where a wider variety of units would be needed by users. That said, I'm almost frightened that barring Imperial units (which are not the same as US Customary units), I actually see a use for the vast majority of those, within my own projects.
I'm a stickler for consistency when it comes to APIs, so I would probably rename some of those properties and methods. Use Length.FromX() for the constructor methods, Length.OneX for the constants, and instance.X for accessing the value in that unit.

Also, multiplication by floating-point numbers should work fine, since it makes perfect sense: 1f * Length.Foot should give you a Length with a value of one foot. C# doesn't have a separately overloadable *= operator anyway.

If boxWidth *= 2f is legal, then boxWidth = boxWidth * 2f is what it will actually be doing, since you will have needed to define the * operator for it to compile.
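
In other words, something like this inside such a Length struct (the _meters field is my assumption again):
code:
public static Length operator *(Length length, float scale)
{
    return new Length(length._meters * scale);
}

public static Length operator *(float scale, Length length)
{
    return length * scale;      // also allows the 1f * Length.Foot form
}

// With * defined, "boxWidth *= 2f" compiles and simply expands to
// "boxWidth = boxWidth * 2f"; there is no separate *= to overload.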

Inverness
Feb 4, 2009

Fully configurable personal assistant.
Really? RakNet source has always been free to download. You still have to pay if you want to use it commercially.

Inverness
Feb 4, 2009

Fully configurable personal assistant.

The King of Swag posted:

Two floats with the "same" value that were arrived at through different calculations will, more often than not, not actually equal each other. Making the default equality operator check for exact equality is almost never what you want, and it would make feeding them into a collection a serious pain. The best solution would be having the default equality operator perform a tolerance check, and then having another ExactlyEquals() method that performs an exact equality check.
I realize this, but the reason I suggested it is also consistency. The default equality implementation for value types, including other floats or things like strings, is always an exact comparison. For case-insensitive string comparison or tolerant float comparison, you have to specify how you want that handled. I think having the default equality comparison for your unit types be a tolerant one obscures what is actually happening.

The question also arises of what a user does if they don't like the tolerance of the default equality comparison. People have different needs, after all. That tolerance is also hidden from a casual glance at the code unless you go into the documentation to check what it is.

Maybe you could do something like specify the tolerance for the default comparisons statically or for a specific thread, but it still seems like it is introducing uncertainty. You're changing the behavior of the equality operator by having it be tolerant by default. It's something I specifically advise people against doing when it comes to languages that have operator overloading.

Inverness
Feb 4, 2009

Fully configurable personal assistant.

The King of Swag posted:

I'm going to put some serious consideration into this; probably mock something up and see how unwieldy it is to use compared to how it is now. I have the feeling (actually, I'm almost certain) that it'll make them unwieldy to use as keys in keyed collections, but might make other comparisons easier to make, if I allow for variable tolerances; either on a type or per-instance basis.

Something else I'm also considering that's tangentially related, is switching internally from floats to doubles. Doing more tests, I found that the precision of floats isn't that great when dealing at the far ends of the supported units (gigajoules vs microjoules), and while it remains equally important to use floating point numbers smartly*, using a double does allow for much greater precision at very large and very small values. Computationally, doubles don't have the performance drawbacks over floats that they used to (at least for .NET with modern hardware), so I'm not so much worried about that. The only real issue I can find with it, is that the user will likely have to cast the values to float in most situations, as Unity and games still stick to float like bees to honey.

* Don't keep performing calculations on the same value over and over, whenever you have access to the source value and can perform a fresh calculation using it instead. This prevents error creep over time.
I tried storing the values internally as a long instead of a double. In a case like mass, the long would represent the number of nanograms. This follows from my observation of TimeSpan and the assumption that it would help avoid the accumulation of errors.

This is a struct I made to test my ideas: http://pastebin.com/vLCfUg3q

It seems Mass.OneKilogram.Kilograms is not a clean 1.0 but 0.999 repeating, due to the number of digits involved in that property. :saddowns: Not sure how much of a problem that would be for a user. It seems like a double for internal storage would be better anyway, as opposed to what is basically fixed-point math.

Since you're doing this specifically for Unity, which prefers floats over doubles, you could have your properties provide floats by default with separate properties for doubles: m.Pounds would return a float and m.PoundsAsDouble would return a double. Alternatively, if you want to focus outside of Unity, use m.Pounds for the double and m.PoundsAsSingle for the float.
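
Roughly what I mean, as a cut-down sketch rather than the pastebin code itself:
code:
// Long nanogram storage with a float property by default and a double
// variant alongside, for a Unity-centric API. All names are made up.
public struct Mass
{
    private const double NanogramsPerPound = 453592370000.0;   // 453.59237 g in nanograms

    private readonly long _nanograms;

    public Mass(long nanograms) { _nanograms = nanograms; }

    // Unity-friendly float by default...
    public float Pounds { get { return (float)(_nanograms / NanogramsPerPound); } }

    // ...with a double version for callers that want the extra precision.
    public double PoundsAsDouble { get { return _nanograms / NanogramsPerPound; } }
}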

Inverness fucked around with this message at 22:48 on Jul 7, 2014

Inverness
Feb 4, 2009

Fully configurable personal assistant.
C# also offers the 128-bit decimal type. I've never used it and have no idea how it performs; it seems mostly intended for financial calculations.

Inverness
Feb 4, 2009

Fully configurable personal assistant.

The King of Swag posted:

Just to comment again on the floating point issues; I just finished writing (and verifying it works with tests) a fuzzy comparator and hashable fuzzy comparator. Following Inverness's suggestion, I made the different types perform regular equality checks instead of fuzzy equality checks. If you explicitly want fuzzy equality or comparison, you can do this:

*snip*
I don't like the approach you're taking with the syntax here. It seems very odd.

When you want to customize the hashing behavior of a dictionary, you should provide the dictionary with a custom IEqualityComparer<T> implementation. Dictionaries and other properly designed hashing collections use an instance of this and can take one as a constructor argument. When you don't provide one, the framework uses a default equality comparer that calls the GetHashCode() and Equals() methods of the instances. A custom equality comparer need not even touch those methods. Equality comparer types already exist for things like case-insensitive string comparison.

Your other example:
code:
myLength.Fuzzy(customError, customULPTolerance) == otherLength.Fuzzy(customError, customULPTolerance)
Is also odd since it seems like you're making the code more verbose just to hold onto the ability to use the equality operator. By convention, custom equality checking using an instance is done with an overload of Equals() like I suggested before:
code:
myLength.Equals(otherLength, customError, customULPTolerance)
I'm not sure about a case where you need fuzzy less-than or greater-than comparison.

Both of these will fit in with the conventions used by the .NET Framework and other libraries.

If you decide to do it this way, you would construct your custom equality comparer instance with the customError and customULPTolerance arguments that will be used when calling Equals() on the instance.
code:
    public class FuzzyEqualityComparer : IEqualityComparer<MyType>
    {
        private readonly double _customError;
        private readonly double _tolerance;

        public FuzzyEqualityComparer(double customError, double tolerance)
        {
            _customError = customError;
            _tolerance = tolerance;
        }

        public bool Equals(MyType x, MyType y)
        {
            return x.Equals(y, _customError, _tolerance);
        }

        public int GetHashCode(MyType obj)
        {
            // Hash at least as coarsely as the fuzzy Equals(), so that anything
            // it treats as equal lands in the same bucket.
            return obj.GetHashCode(_customError, _tolerance);
        }
    }
And to use that...
code:
var fuzzyDict = new Dictionary<MyType, OtherType>(new FuzzyEqualityComparer(something, something));
If you created fuzzy equality comparers for floating point types or your own types that stored a floating point type internally, then you could use it elsewhere too and have it actually be properly compatible with existing code.

Inverness fucked around with this message at 00:05 on Jul 12, 2014

Inverness
Feb 4, 2009

Fully configurable personal assistant.
It seems Unreal Engine 4 already has preview notes for 4.4.

They're just wrecking the competition. :allears: One of the advantages Unity had over UE4, 2D support, is a gap that's now being closed by the Paper2D extension.

Inverness
Feb 4, 2009

Fully configurable personal assistant.
It doesn't help that Unity chose to implement their own version of Mono instead of licensing the existing one, which means tremendous amounts of work will go into reinventing the wheel just to keep their version of Mono up to date with C#. Those are resources that could have been spent improving Unity itself. Considering the pace Unity is at now, I expect the gap between their Mono version and the latest C# to widen, and complaints to increase as people lose the ability to have the same source run on Unity, .NET, and Mono alike.

UE4 removed UnrealScript because they finally got to the point where the cost of having to implement things in UnrealScript with all of the interop overhead, lesser debugging features, and weaker IDE tools outweighed the benefit of avoiding C++ for higher level code.

Jo posted:

I'm in kinda the same boat. The right way (or any way) to do stuff in Unreal feels masked behind a bunch of boilerplate. Maybe the Unity style of "everything is a surface-level component" leads to obtuse or bad design in larger projects, but boy does it help early on.
Can you elaborate on what you mean by stuff feeling masked behind a bunch of boilerplate?

Inverness
Feb 4, 2009

Fully configurable personal assistant.

xgalaxy posted:

This is exactly the thing I'm battling now with my posts in another thread regarding ServiceStack. The stock library functions fine under Xamarin.iOS and Xamarin.Android but didn't work on Unity. Luckily this particular library was really easy to port. However, other libraries which utilize .NET 4+ that work perfectly fine on Xamarin are not portable to Unity without tremendous amounts of effort, especially if they make heavy use of new .NET features like tasks and async / await.

It seems like Unity is trying to get rid of the problem by developing their C# to native compiler. My biggest fear is that this will just make things even more incompatible, or worse, their native implementations exhibit inconsistent behavior with the "reference" implementation (Microsoft's).

And why are they doing their native compiler when Microsoft is already working on this very thing. Just seems like they are falling into the same trap.
I forgot about their C# to native compiler, or I probably would have mentioned that too. Of all the dumb things to do. :bang: There's a point where it would just be better to clean up the source and make it, or at least an API, available.

If they have the resources to sink into making a C# to native compiler or maintaining their own version of Mono they should drat well have the resources to pay up for the latest Mono version.

darkpool posted:

I've been at a bunch of Unity events this year, once I was at dinner with the management just after the UE4 source announcement and the discussion went something like this.

"Should we care about UE4 source being available?"
"Eh, who gives a poo poo?"

Some of their big partners are getting upset by this attitude too.
:stare: Maybe they don't realize that companies would pay Epic six digits for access to the full UE3 source. Getting UE4 source code for $20 is an incredible benefit.

You don't really need any more evidence than this that Unity is on a downslide, and that it will take a management change to get them off their asses.

Jo posted:

Let's take this example. I want to create an item which is visible, has physics, and can be picked up by the player. In Unreal I'd probably subclass Actor, right?

*snip*
You shouldn't need to do a raycast to check if it's picked up; just check for an overlap collision with the sphere or whatever you're using. I don't remember how UE4 handles physics events, so I can't recommend anything there. You wouldn't hold a reference to the player either; just give the item to whatever pawn collided with APickup.

Your Unity example has you holding a permanent reference to the player and then checking the distance between the player and the pickup for each pickup, while ignoring any possible collision volume or mesh. Yet your UE4 example does it more properly by creating a sphere for collision and specifying the static mesh for the visual aspect. The two examples really aren't showing the same thing.

You're right that you need more code to implement that, but then C++ is more verbose and UE4 is more powerful. It's also only something you'd need to implement once before reusing the class.

roomforthetuna posted:

UE4's "flappy chicken" demo appears to be 28MB on Android. As some kind of person from medieval times this makes me sad - you could totally have played a game like that on a machine with 16K. Even allowing for higher resolution graphics and a reasonable amount of bullshit 3D interface wrangling surely we shouldn't be excusing this "everyone has infinite everything" approach to modern software development. We'll never get nice things if we keep making the same old poo poo take up the full hundreds-of-times expanded capacity of our computer-machines. Get off my lawn.
UE4's engine module is a beast; it's not surprising that they can't just strip all of that out. Flappy Chicken is also actually rendered in 3D, with 2D sprites being shown and hidden based on blueprint code, and I'm not sure how much overhead that adds compared to something like Paper2D.

Edit:

dupersaurus posted:

In a competition of capability vs barrier to entry, Unity's still and likely will be top dog for awhile. We the regulars are often of a level in which Unity's deficiencies compared to UE4 might start to matter (and even then a lot of us are still using Unity); but to someone coming in new as a "making games sounds fun" person, or even an experienced hobbyist, how likely are you actually going to recommend UE4 over Unity? I can't imagine that's something that's going to change soon or very easily, no matter how much Unity might (a big might) screw things up.
That's true. Though I have to say, UE4's editor is a tremendous improvement over UE3's. It reminded me of Unity's when I first opened it. The blueprints feature is going a long way in making programming easier too by allowing people to avoid C++ entirely in many cases.

I haven't used Unity myself so I can't really make a real comparison.

Inverness fucked around with this message at 23:10 on Aug 4, 2014

Inverness
Feb 4, 2009

Fully configurable personal assistant.
It depends on what you mean by "not doing anything in the editor". If you mean minimizing it or not having it focused, yeah, I'm sure there is room for optimization there if UE4 doesn't do it already. If you mean having it active but just not moving your mouse or giving any input, then that will not help as much because, unlike your typical desktop application, the GUI is rendered by the engine at a high FPS and redrawn constantly, just like anything in the game.

Inverness
Feb 4, 2009

Fully configurable personal assistant.

Jo posted:

I don't think I'm storing a reference to the player in the Unreal code, am I? I _am_ in the Unity code because it makes it faster to check the transform. In the UE code, if I gave the item to whatever pawn collided with APickup, is there a chance a non-player character could pickup the item, or are the pawns solely players?

As for the raycast, you're right. I shouldn't have said that. I do still have work to do on that code, though. It's not "complete" in the sense that I haven't overridden any collision overlap function or implemented the check to see if someone's inside.
Both players and AI characters can have pawns.

quote:

The Unity example is checking to see if the distance from the player is less than the pickup threshold, yes. I get that it means the player could theoretically pick up an item through a wall, but that fix is a raycast away. Wouldn't a sphere collider in the Unreal example also have the same problem of picking up things through walls? Why is a sphere collision 'more correct' than checking the distance?
Well you could also make the sphere bump the wall so it never intersects, in which case you can't pick it up through the wall. A sphere collision is more correct because you're letting the physics system handle the details when it comes to collision checking instead of manually performing checks every frame for a specific actor. You shouldn't be referencing the player directly like that. Instead you would check whatever pawn collided with the sphere to see if it is eligible for pickup. The existing physics system is also integrated with the editor, blueprinting, and whatever else uses it.

quote:

I think it's not an unfair comparison. I'm showing what it takes to have an object which is visible, has physics, and can be picked up (or have a pickup function be called) in Unity and in Unreal. The means by which they need to have this done are different, yes, but I think it qualifies well my argument that Unreal has more 'boilerplate' code to get things going. I can't just make a component and attach it to an object. I need to either make a new object, subclass an object (maybe rework my hierarchy), or fiddle with Unreal's component system (which I haven't tried yet). It's not so much that C++ is more verbose than C# (it is, but that's not really why the example is longer), it's because when I make an object in Unreal I need to define my Mesh object, my collision primitive, set the possibly overridable OnPickup function, and do all sorts of other things rather than add another behavior onto an object which has all these. The difference in length stems from (I'd estimate) 50% component versus inheritance and 50% C++ vs C#.

But that was my point, right? It takes longer to do stuff in Unreal because you have to write more.
Yeah it does take longer to do certain things in Unreal, but then, as I said, you're doing more. Unity both does a lot for you and limits what you can do, as far as I know. It's a tradeoff.

The reason I didn't feel it was a fair comparison is because both engines have existing systems for handling collision that you were not using. Raycasting every frame on a specific player reference seems like just about the worst way to do it. I'm sure the Unreal code would end up larger but it'd still make a better comparison if you were doing it properly for both.

Now then, I was just looking through the code for the newest ShooterGame example project for UE4, and found a pickup class that demonstrates just what we were talking about:
ShooterPickup.h
ShooterPickup.cpp
ShooterPickup_Ammo.h
ShooterPickup_Ammo.cpp

Inverness
Feb 4, 2009

Fully configurable personal assistant.

Jo posted:

That's true. I guess I could have an extra collider in Unity and use the OnCollisionEnter. I hadn't thought about that when I wrote it, but it would make the Unity code shorter. I'd just override the OnCollide or OnCollisionEnter. No need to raycast or any of that. That balances out the demo and Unity still ends up being less code. Again, this is just a side-effect of mostly component based versus inheritance based. That's what leads me to say Unreal Engine requires more boilerplate: to add functionality to what we'd think of as an in-game object, you need to subclass an object with the functionality, rather than simply making a component. I believe the latter leads to shorter code and faster development, but allows people to make bad decisions and doesn't punish lazy programming.
That's true. I think Epic realizes that components are a superior model, but the problem is they have way too much code to just flip over to using them. They've been slowly moving actor code into components as time goes on. There is a tremendous difference between UE3 and UE4 code when it comes to that.

But still, remember that you're making your PickupHandler MonoBehaviour to add the code that actually works with the components. That is equivalent to the subclassing in Unreal with regard to where your component-using logic goes. Making an actor subclass with logic that uses components attached to it isn't that much different from making a component subclass that uses components attached to a common actor. Though of course the actor subclass is more limited in how you use it compared to a component.

Inverness fucked around with this message at 03:42 on Aug 6, 2014

Inverness
Feb 4, 2009

Fully configurable personal assistant.

echinopsis posted:

Is there an issue using this preview version of UE4 4.4 to start to develop on? I mean I know I'm very new and it's very likely by the time I start making what I really want to I'll start fresh anyway and it might not be preview by then but regardless is it difficult to import from say 4.3 to 4.4 or whatever?
There shouldn't be enough changes from 4.3 to 4.4 that you would have any problems. There might be some API changes, but not major ones. I imagine you would have to recompile if the binaries aren't compatible, but I can't be sure of that. On the other hand, I don't see the harm in starting with the 4.4 preview.

Inverness
Feb 4, 2009

Fully configurable personal assistant.
Unreal Engine 4.4 is released :yotj:

Only one month after 4.3.

Inverness
Feb 4, 2009

Fully configurable personal assistant.

Quiet_ posted:

I made a whole "game" with a hud and enemies, win/lose conditions, weapons etc etc in just blueprints. I could see getting into c++ nitty gritty for things as echinopsis says (accessing vertex info)

You also get to marvel at this stylish art work of your "I'll just throw this here" design!


It pains me to look at that, since I know UE4 includes features for grouping blueprint nodes, creating macros, and adding comments.

Inverness
Feb 4, 2009

Fully configurable personal assistant.

Obsurveyor posted:

I think you're confused because they already have two different license models. What they said was customers don't really want a royalty-based one. If there is not enough demand then there's no business reason to spend the money for all the infrastructure that supporting and, more importantly, enforcing the royalty model would require.
I don't believe that. If they had suggested the idea of a royalty-based license in exchange for source code access, I assume many people would jump at the chance. They're just tone-deaf and have likely convinced themselves that they know what people want.

Inverness
Feb 4, 2009

Fully configurable personal assistant.
Here is a blog post from a team that switched from Unity to UE4 and feels much better about it.

The screenshots comparing their old Unity build to the new UE4 build are like night and day. :eyepop: I doubt that's the best Unity can do, so they must not have been comparing things evenly there.

http://martiancraft.com/blog/2014/08/an-unreal-decision/

I also noticed that the new Content Examples project for 4.4 includes a level demonstrating Paper2D features. One of the things I thought might be useful is a sprite animation tool that doesn't involve whole separate frames, but instead defines sprite parts and then animates them by changing their position over time. I started to make one in C# but never had the motivation to finish. None of the existing tools I found with enough functionality even supported defining a sprite from a part of a texture instead of a whole one.

What I have in mind is pretty simple too. You define a set of sprites for an animation, where each sprite is a rectangle from a source image. This sprite set could be exported or imported as necessary. For each frame in the animation you merely place sprites on a 2D plane. Custom properties could be defined per animation or per frame for things like when to play a sound or when damage should be triggered in an attack. Texture substitution would also be available in-editor, so you could reuse the same animation with differently textured sprites as necessary.
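
To pin down the data model I have in mind (this isn't any existing tool's format, just a rough sketch of my own):
code:
using System.Collections.Generic;

// Sprites are rectangles cut from a source texture, frames place those sprites
// at arbitrary positions, and custom properties can hang off the animation or
// off individual frames (play a sound, trigger damage, and so on).
public class SpriteDefinition
{
    public string Name;
    public string SourceTexture;         // swappable, for texture substitution
    public int X, Y, Width, Height;      // rectangle within the source image
}

public class PlacedSprite
{
    public string SpriteName;            // refers to a SpriteDefinition by name
    public float OffsetX, OffsetY;       // position on the 2D plane for this frame
}

public class AnimationFrame
{
    public float Duration;
    public List<PlacedSprite> Sprites = new List<PlacedSprite>();
    public Dictionary<string, string> Properties = new Dictionary<string, string>();
}

public class SpriteAnimation
{
    public List<SpriteDefinition> Sprites = new List<SpriteDefinition>();
    public List<AnimationFrame> Frames = new List<AnimationFrame>();
    public Dictionary<string, string> Properties = new Dictionary<string, string>();
}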

Maybe one day I'll work up the motivation to finish it.

Inverness fucked around with this message at 14:43 on Aug 15, 2014

Inverness
Feb 4, 2009

Fully configurable personal assistant.

Sagacity posted:

You seem to be describing something like Spriter?
I looked at that briefly and it seemed really incomplete and didn't do much of any of the things I wanted, such as being able to use a sprite sheet.

xgalaxy posted:

Bone based animation for Paper2d is on their trello roadmap.
As far as editors for that stuff you should look into Spine.

Both Spriter and Spine will do what you want. I personally feel that Spine is better but you should try both.
I'm not paying for Spine for a project that isn't even commercial. I'm also not really looking for bone-based animation; my game is top-down, for one. I just need to be able to place sprites at arbitrary positions in a frame, no bones or anything. It would be used for animating anything from an explosion to a gate opening.

Inverness
Feb 4, 2009

Fully configurable personal assistant.
I'm also someone who fell in love with C# after moving away from C++. I don't think moving from C# to C++ for development in UE4 would be too much of a hurdle. I say this mostly because Epic seems to be good about structuring their code and providing tools so you can get the most done with the fewest lines possible. And yes, the C++11 and C++14 features are a positive.

That's not even getting into Blueprints.

UE4 4.5 will also be introducing hot reloading for gameplay code. You'll be able to write all of the C++ you need and compile and load it into the game without ever restarting the editor. Of course that doesn't help if you crash the game. It's also odd for me to think about how that is even possible; I suppose it's a unique advantage due to the nature of C++. You certainly can't reload assemblies in C# without placing them into a separate app domain, which introduces marshaling overhead at the boundary.

On a C++ note, even though I rarely touch it nowadays, Microsoft's glacial pace when it comes to supporting the new standards irritates the gently caress out of me.

Inverness fucked around with this message at 04:58 on Aug 16, 2014

Inverness
Feb 4, 2009

Fully configurable personal assistant.

Shalinor posted:

If you crash your game in UE4 in editor, does it take the entire editor down with it, or is editor state independent / you can recover from a crash or an infinite loop without losing everything since the last time you saved?

(in Unity, if you hit an infinite loop, you're hosed, done - there's no separate process to End Task on, it's just flat dead)
The editor is fully integrated with the engine; if you crash things there or create an infinite loop, you're hosed and have to restart. You can playtest a game by launching it in a separate process, but most prefer to just play it in the editor because it's faster and easier.

Inverness
Feb 4, 2009

Fully configurable personal assistant.
They sure changed their tune pretty fast, though not with the parts that matter the most.

Inverness
Feb 4, 2009

Fully configurable personal assistant.
A recent Epic blog post included a link to an architectural demonstration with UE4:

https://www.youtube.com/watch?v=UwEuSxAEXPA

:sbahj:

Inverness
Feb 4, 2009

Fully configurable personal assistant.

lethal trash posted:

At my office we've been having so many problems using git as our primary source control for all assets. So now we're looking at alternatives, those being Perforce and Plastic SCM. I don't think we'll be going with Perforce, so that really only leaves Plastic SCM. I tried it out a little bit last weekend for the LD30 and I was pretty pleased. It was fast, the error messages were helpful and everything worked nicely. I am wondering if anyone else has used Plastic and if they can share their experiences? If you've used it in a production environment even better. I really want to know about the gotchas that come after longer use.
Why are you leaving out Perforce? It's free for up to 20 users. Do you have more than that?

Inverness
Feb 4, 2009

Fully configurable personal assistant.
I didn't even know SourceTree existed until now. I always just used TortoiseGit and TortoiseHg. I assume this is better?

Inverness
Feb 4, 2009

Fully configurable personal assistant.

Stick100 posted:

Yup, you are 100% correct. If the source control protocol doesn't give you exclusive checkout, then no client will help. Does Perforce allow exclusive checkout?
Yes, Perforce is a centralized system. If you have gigabytes or terabytes of files then it's what you want.

It's also compatible with Git.

Perforce is the name I see tossed around for companies that need high-quality centralized version control, so I would probably go with it if Git were unsuitable. UE4's integration is also a good reason if you use that.

Inverness
Feb 4, 2009

Fully configurable personal assistant.

echinopsis posted:

I'm thinking about Source Control purely as a backup (I am a 1 man thing) so based on this discussion I can't work out which is the easiest and best AND FREEEST for UE4
Git would work fine for that, since you won't be passing huge repos around among a team.

Inverness
Feb 4, 2009

Fully configurable personal assistant.

echinopsis posted:

Thanks. I believe I need a plugin for UE4 right?

by Git do you mean GitHub? Or is that just a "brand" of Git? Or do you recommend another one?
Git is a distributed version control system. GitHub is a website for hosting Git repositories and providing features around that.

You don't need to use GitHub or Bitbucket unless you want to back things up externally and share with others; you can create a Git repository and just keep it local without pushing it anywhere. GitHub does not allow you to create private repositories for free, though Bitbucket does.

Inverness
Feb 4, 2009

Fully configurable personal assistant.
Someone might have made a Git plugin for UE4.

Edit: Now that I've read what you said in more detail, can you elaborate on the trouble you had? You should first learn to use Git before you try to integrate it into your workflow; just tossing it in is asking for trouble. I don't even know how UE4 handles version control.

If you tried to get a Git plugin without actually installing Git, you probably had trouble.

Inverness
Feb 4, 2009

Fully configurable personal assistant.
Which line is the error message indicating?

Inverness
Feb 4, 2009

Fully configurable personal assistant.

pianoSpleen posted:

The only place it gets awkward is if there are specific instance traits that only certain ItemTypes can have (for example, a "Condition"/"Durability" trait for only armour). There are a few workarounds but they're all a bit clunky.
I'm using C# so a simple cast is sufficient. The same can be done in C++ with dynamic_cast.

If you have a limited number of types and want to be more strict, using an enum for all possible types is fine.

Inverness
Feb 4, 2009

Fully configurable personal assistant.

dupersaurus posted:

If I'm making a tick manager in Unity, should I be using C# events, or roll my own with linked lists or something? Events are drat handy, but I don't know what sort of extra performance overhead (if any) there is with them, with dozens of listeners frequently coming in and out.
Multicast delegates are backed by immutable arrays of delegates. What you're doing when you add a handler to an event (events are syntactic sugar over multicast delegates) is creating a new object with all of the old delegates plus the new one. Every add or remove adds memory pressure, and removing delegates is costly since you have to find a match in the array. Invocation of a multicast delegate is fast, though.

It's basically like adding and removing characters from a string. Whether this is okay for you depends on how frequently you're adding and removing objects. In general though I wouldn't use events because you have no control over ordering or anything else.

I think you're better off using a linked list for managing objects that are frequently added and removed but need to be iterated over. In my own game I add each actor to a linked list, then store the LinkedListNode in the actor so I can easily remove the node on destruction without iterating.

In my case I manually iterated over the linked list by doing:
code:
LinkedListNode<Actor> node = _actors.First;
while (node != null)
{
	node.Value.Update(elapsed);
	node = node.Next;
}
I did that to avoid allocating an enumerator for every level 60 times a second, but that was before I found out that .NET Framework collection enumerators use structs instead of classes, so that isn't really necessary.
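
For reference, storing the node looks roughly like this (a simplified sketch of my own code):
code:
using System.Collections.Generic;

// Each actor remembers its own LinkedListNode so destruction is O(1),
// with no search through the list.
public class Actor
{
    internal LinkedListNode<Actor> Node;

    public virtual void Update(float elapsed) { }
}

public class Level
{
    private readonly LinkedList<Actor> _actors = new LinkedList<Actor>();

    public void AddActor(Actor actor)
    {
        actor.Node = _actors.AddLast(actor);    // O(1) append; node kept on the actor
    }

    public void DestroyActor(Actor actor)
    {
        _actors.Remove(actor.Node);             // O(1) removal using the stored node
        actor.Node = null;
    }
}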
