28 Jan, 2011, Chris Bailey wrote in the 1st comment:
Votes: 0
So let us assume that we have a Player model with various properties that need to be stored in my database for persistence. Most are standard string or integer fields, but a couple of them are more complex objects. For instance, each Player instance has an Inventory object. The Inventory object includes a data structure populated with any items the player may be carrying or wearing. The only thing I can think of to do is serialize the Inventory object and all of the Item objects in YAML or JSON, and dump them into a Text field in the DB. Is this a worthwhile approach, or should I look into other options?
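
Just to make that concrete, the sort of thing I'm picturing is roughly this (the class and method names are only placeholders, not real code from my project):

require 'yaml'

# Rough sketch of the serialize-everything idea: flatten the whole
# Inventory (and its Items) into one text column on the player row.
class Player
  attr_accessor :name, :inventory

  # What would get written into the TEXT field in the DB.
  def inventory_blob
    inventory.to_yaml
  end

  # Rebuild the Inventory (and its Items) from the stored blob.
  def load_inventory(blob)
    self.inventory = YAML.load(blob)
  end
end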


EDIT: My other idea was to just map relationships between the items and their owner and store them in the traditional way. I'm not sure which would be better; I don't work with databases very often!
28 Jan, 2011, plamzi wrote in the 2nd comment:
Votes: 0
Hi,

Of course, you can serialize and store the inventory as a glob inside the Player table, but that defeats the whole purpose of having a database, if you ask me. Your "other idea" is the right way to go.

When you think about efficient db storage, you will very likely want to end up with a separate table for each structure. Don't be distracted by the fact that the inventory struct, once loaded up, lives under the player struct. In a database, you want to keep any struct that may be accessed separately in its own table, and you want to keep your tables as vertical as possible (minimize the number of columns).

An inventory is a perfect example: if you have sensible player state-saving logic, the inventory will need to be updated separately, and you wouldn't want to dig through a very wide table just to update a few of its columns.

With the above in mind, the player inventory will be its own table, with a player id column which you will use to join the table to any other tables as needed. Every item in the inventory will be its own row, with one of the entries being the player id this item belongs to.

But, in a relational db, which I'm assuming is what you're talking about, every row needs a unique id. That can't be the player id alone because a player can carry more than one item. My solution (there are tons of others that are probably just as good, if not better) was to compose the unique item id from the Player ID + Inventory Number. Then, I use the LIKE operator in MySQL to fetch any rows that match "Player ID + *". This gives me the saved items for a player, which I can then load onto them when they enter the game.
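
Sketched in Ruby with the mysql2 gem purely for illustration (the table name, column name, and the "-" separator below are assumptions, not my actual schema), the fetch looks something like this; note that MySQL's LIKE wildcard is '%':

require 'mysql2'

client = Mysql2::Client.new(:host => "localhost", :username => "mud", :database => "mud")

player_id = 42

# Composite id = "<player id>-<inventory number>", so every saved item
# for this player can be pulled with a single prefix match.
items = client.query("SELECT * FROM inventory WHERE item_id LIKE '#{player_id}-%'")

items.each { |row| puts row["item_id"] }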

I also store the Inventory Number of each item in a separate column. When something in the inventory changes, I store the new items slot by slot, and then drop any existing rows with Inventory Number greater than the latest max inventory number.
28 Jan, 2011, Chris Bailey wrote in the 3rd comment:
Votes: 0
Got it. All fixed. Thanks =)
29 Jan, 2011, Runter wrote in the 4th comment:
Votes: 0
Dunno what you ended up with, Chris, but I've been using datamapper pretty heavily. You're looking at something like this for the simplest example:

class Player
  include DataMapper::Resource

  property :id, Serial
  # …

  has n, :items
end

class Item
  include DataMapper::Resource

  property :id, Serial

  belongs_to :player
end


but realistically you probably want:

class Player
  include DataMapper::Resource

  property :id, Serial
  # …

  has 1, :inventory
end

class Inventory
  # …
end

class Item
  # …
end
29 Jan, 2011, Chris Bailey wrote in the 5th comment:
Votes: 0
Your second example is along the lines of my first implementation. How would you go about storing item instances in the database using that example? That is what made me think I needed to serialize them. I figured I must just not be thinking about the problem from the right angle. :P
29 Jan, 2011, Runter wrote in the 6th comment:
Votes: 0
Well, here's a very simple example. Consider your model: a Player that has 1 Inventory, and an Inventory that has n Items.

class Player
  include DataMapper::Resource

  property :id, Serial
  property :name, String
  # …

  # Query items short circuit.
  def items; inventory.items; end

  has 1, :inventory
end

class Inventory
  include DataMapper::Resource

  property :id, Serial

  has n, :items

  belongs_to :player

  def count(opts = {}); items.count(opts); end # return the number of items in the inventory.
  def heaviest; items.max(:weight); end        # maybe so you can have them drop the heaviest item?
  def weight; items.sum(:weight); end          # total weight.
end

class Item
  include DataMapper::Resource

  property :id, Serial
  property :tag, String
  property :vnum, Integer   # prototype number, used in the lookup example further down.
  property :weight, Integer # just an example field for the purpose of the Inventory container.

  belongs_to :inventory
end

DataMapper.finalize

# somewhere after init of the datamapper database and auto db migration.

# create a player:
# query for the first player matching the query, otherwise instance and save it
# with the query parameters + arg2.
retnur = Player.first_or_create({:name => "Retnur"},
                                {:inventory => Inventory.new})

# now we have Retnur's model to access, including an inventory.

# Create a new item and place it in the inventory.
retnur.items.create(:tag => "sword")

# Query the inventory and list all items by tag.
retnur.items.all.each { |obj| puts obj.tag }

# By this point you would only see "sword".
# So destroy Retnur's entire inventory, i.e. purge all of his item rows.
retnur.items.destroy! # bang for no validations; it's faster if you don't need them/none defined.

# This was a naive example. More commonly you'd see something like this:

# Look up the item with vnum 10 and place it in the inventory.
retnur.items << Item.first(:vnum => 10)
retnur.save # persisted

# to query all items:
Item.all

# to query only the items Retnur has:
retnur.items.all

# and if Chris existed you could find intersections and all sorts of neat things easily.
(chris.items.all - retnur.items.all).each { |item| puts item.tag }

# For a final example, which is going to be useful to you, and to give you an understanding of the
# modification and lazy query aspect, consider needing to purge only inactive player files.

players = Player.all # a query for all players…not yet acted on. I.e. free.

# query all players (cost now paid once) and narrow the query in place to exclude the active ones.
players.each { |ch| players = players.all(:id.not => ch.id) if ch.active? }
# And there's probably cuter ways to modify that query.
# instead of
#   players = players.all(:id.not => ch.id)
# maybe this works:
#   players -= [ch]

# the query now only contains inactive players.
players.destroy! # cost paid for the second time.

# Compare it to this:

# cost paid the first time…
Player.all.each { |ch| ch.destroy! if !ch.active? } # …then paid n more times for the destroys,

# where n is the number of inactive players.
# If you have a large store of player files this would be highly inefficient (although it works).
# For this particular example, it probably won't matter that much in the long run.
# But it is fairly important to realize that building the query is not the same as executing it.

Player.all # cost never paid because there's no each call.
Player.all.all.all.all.all # cost never paid.
Player.all.all.all.all.all.each {} # cost only paid once.


Heh, heh. And no, I haven't tested or even run this code in an interpreter.
29 Jan, 2011, Runter wrote in the 7th comment:
Votes: 0
Oh, and something else to keep in mind. When you're using a database library like activerecord or datamapper, you're basically agreeing to use the readers and writers they define for instance variables. You generally won't ever access the instance variables directly like you might otherwise do. If you do, a) the data may be more raw than the database type wrapper would make it, b) it won't consistently even contain data, since a lot of it is lazily loaded, and c) changing the in-memory value may not result in it being persisted correctly.
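
A quick sketch of the difference (untested like the rest, and the internal instance variable name is an implementation detail you shouldn't rely on):

player = Player.first(:name => "Retnur")

# Good: the generated writer marks the attribute dirty, so save persists it.
player.name = "Retnur the Second"
player.save

# Bad: poking the instance variable directly bypasses typecasting, lazy
# loading, and dirty tracking, so the change may never reach the database.
player.instance_variable_set(:@name, "Retnur the Second")
player.save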

Getting used to the DSL these libraries define is difficult at first, but later you start appreciating all the work you didn't have to do.
29 Jan, 2011, quixadhal wrote in the 8th comment:
Votes: 0
Dunno about the code side of the universe, but an inventory for a player in SQL might look like:

create table players (
  player_id integer primary key not null
  -- … other player columns
);

create table items (
  item_id integer primary key not null
  -- … other item columns
);

create table inventory (
  player_id integer references players (player_id),
  item_id integer references items (item_id),
  count integer
);


That assumes generic items. If you wanted items split into prototypes and instances (i.e. so they could take damage, etc.), you could add another table to hold the instances, or add an "instance_id" column to the items table where NULL means prototype. In that case, the inventory table would store instance_ids instead.

This is the classic way to make arrays in SQL using junction tables.

I have NO idea if your fancy OO-API will even allow you to see that level of detail, but that's typically how you'd do it in database design. I tend to design things from the data end and then figure out code to work with that. After all, putting your data in SQL but making it hard to query with SQL tools is… kinda pointless.

Good luck!
29 Jan, 2011, Runter wrote in the 9th comment:
Votes: 0
I'll just say that Chris was specifically asking for datamapper advice. Putting that aside, if your advice is to lawl at ORMs in general and suggest it's superior just to write SQL the good old-fashioned way, please allow me to make the counter argument.

For 99% of applications written with datamapper (or other "fancy OO-APIs") there's no reason to write raw SQL. Yes, raw SQL allows you to be very specific about the models and how they are represented in the database with their relations. But the code you write in datamapper is for the common case, and you can pass options to just about anything to fine-tune the way the tables are designed based on the relationships. Furthermore, if you really want to, you can communicate directly with the store in raw SQL through the API. (Nobody really wants to.)

I think most pragmatic developers these days have decided that deriving migrations after describing your model is the most effective way of doing things for most programs. It's also agnostic about the underlying database (or the database language!), so you can write your libraries or applications while giving users the easy option of which technology to plug in. There are adapters for the usual RDBMS suspects, NoSQL stores, various file formats, and even some popular web services, without needing to change the way you describe your models and their relationships.

Writing raw SQL for everything you do is sort of yesteryear, IMO. There's obviously still a place for it, but people can now effectively write complex database models in a very natural way without ever touching it.
29 Jan, 2011, Chris Bailey wrote in the 10th comment:
Votes: 0
Well, that makes it much clearer, thanks a lot. I'm not sure what I was missing initially, I guess I just wasn't thinking about it properly. What is the point of a database if you are just serializing raw objects in it? Ha! Your example is exactly what I was looking for to make it all come together. …MB needs to implement a post rating feature or a way to select a post that solved your problem. =)


@Quixadhal - I do appreciate your time, but Runter was right. I was looking for something more specific to datamapper. Thank you though!
30 Jan, 2011, quixadhal wrote in the 11th comment:
Votes: 0
You can drop the snotty attitude Runter. I wasn't LAWL'ing at data mappers in general. They have their uses. However, if you're not going to be doing SQL queries using normal database tools, why use SQL as a backend at all? There are object databases that are much better for that kind of thing.

I guess I'm the last of the programmers who wants to at least try to understand how things actually work, underneath all the fancy API's and tool sets. Nobody cares, because they just throw money and hardware at it until they hit the wall, and then throw up their hands and say, that's the best we can do, sorry.

@Chris, glad you got what you need.
30 Jan, 2011, Runter wrote in the 12th comment:
Votes: 0
quixadhal said:
You can drop the snotty attitude Runter. I wasn't LAWL'ing at data mappers in general. They have their uses. However, if you're not going to be doing SQL queries using normal database tools, why use SQL as a backend at all? There are object databases that are much better for that kind of thing.

I guess I'm the last of the programmers who wants to at least try to understand how things actually work, underneath all the fancy API's and tool sets. Nobody cares, because they just throw money and hardware at it until they hit the wall, and then throw up their hands and say, that's the best we can do, sorry.

@Chris, glad you got what you need.


There was no "snotty attitude" intended. You didn't really dispute any of my points here, though. In fact, your bit about "why use sql, then?" was precisely one of my points. You don't have to use sql with datamapper. Chris never really specified SQL, and datamapper, activerecord, and probably any of the other ORMs Chris could use can be plugged in with plenty of those object databases you suggested simply as a configuration option. Datamapper also makes it easy to migrate from SQL to a new database seamlessly. So I apologize if you think my attitude is poor. I'm simply stating the facts as I see them.
30 Jan, 2011, plamzi wrote in the 13th comment:
Votes: 0
Runter said:
You don't have to use sql with datamapper. Chris never really specified SQL, and datamapper, activerecord, and probably any of the other ORMs Chris could use can be plugged in with plenty of those object databases you suggested simply as a configuration option. Datamapper also makes it easy to migrate from SQL to a new database seamlessly.


I have no real familiarity with object databases but it seems to me that the choice of relational vs. object would depend on what you want as an end goal. What I wanted out of a database I thought could only be done easily with a relational one. I use the C API for MySQL and write not only queries but also DB procedures manually. Some of the features I develop include web interfaces which interact with the live server. PHP and SQL are close companions and make that easy. An object database would not have cut the mustard in that respect, I believe. It would have saved me time in implementing player equipment storage but then would have impeded the addition of further, more custom features. But if player storage is all you want, maybe there's no need to go relational…

And yes, I agree with quixadhal that knowing what exactly is going on under the hood is the key to developing something outstanding, even if your goal is a db-agnostic framework.
30 Jan, 2011, Runter wrote in the 14th comment:
Votes: 0
plamzi said:
I have no real familiarity with object databases but it seems to me that the choice of relational vs. object would depend on what you want as an end goal. What I wanted out of a database I thought could only be done easily with a relational one. I use the C API for MySQL and write not only queries but also DB procedures manually. Some of the features I develop include web interfaces which interact with the live server. PHP and SQL are close companions and make that easy. An object database would not have cut the mustard in that respect, I believe. It would have saved me time in implementing player equipment storage but then would have impeded the addition of further, more custom features. But if player storage is all you want, maybe there's no need to go relational…

And yes, I agree with quixadhal that knowing what exactly is going on under the hood is the key to developing something outstanding, even if your goal is a db-agnostic framework.


Well, the point is you can use whatever database you want. That's sort of a red herring, because I don't care what the underlying technology is. You can write libraries for people with a simple config option for them to change what is used, without being married to a database-specific language. You say you agree with Quix that knowing what is going on under the hood is the key to good development, but that doesn't sound like what you actually mean, because I think everyone agrees that knowing what is happening is a good thing. It sounds like you're actually suggesting it's just better to write the raw SQL to make good software. If so, refer back to my other post and debunk it. My overall point is that I'm pretty sure you can make good software both ways, but don't discount the many benefits of using ORMs. There are places where they make software better and the drawbacks are few. There's plenty of "outstanding software" that uses ORMs. They're outstanding because they're world-renowned frameworks and enterprise-tier software runs on them.
31 Jan, 2011, plamzi wrote in the 15th comment:
Votes: 0
Runter said:
Well, the point is you can use whatever database you want. That's sort of a red herring, because I don't care what the underlying technology is. You can write libraries for people with a simple config option for them to change what is used, without being married to a database-specific language. You say you agree with Quix that knowing what is going on under the hood is the key to good development, but that doesn't sound like what you actually mean, because I think everyone agrees that knowing what is happening is a good thing. It sounds like you're actually suggesting it's just better to write the raw SQL to make good software. If so, refer back to my other post and debunk it. My overall point is that I'm pretty sure you can make good software both ways, but don't discount the many benefits of using ORMs. There are places where they make software better and the drawbacks are few. There's plenty of "outstanding software" that uses ORMs. They're outstanding because they're world-renowned frameworks and enterprise-tier software runs on them.


In my previous post, I wanted to suggest that neither db-agnostic nor db-specific code is inherently better; it depends entirely on the end goal. I don't know enough about the OP's project to issue recommendations (it sounds like you know that he is writing libraries for other people to use, as opposed to a standalone piece of software, but I don't/didn't know that); all I did was share my specific experience in case it's relevant.

Beyond that, I actually agree with pretty much everything posted by both you and quixadhal because I don't see the points as truly conflicting. It seems to me that we're merely revisiting the age-old debate of whether it's better to build from scratch or re-use. The more abstractly you pose the question, the less likely it is you'll ever get to a useful answer. I would always begin by asking, build what? Based on the goal, the answer can range from "build everything from scratch" to "re-use all the way and add a nice config file on top" (but will usually be somewhere in between).

I'm pretty sure the OP got what he needed in terms of datamapper specifics to take his project further.
01 Feb, 2011, quixadhal wrote in the 16th comment:
Votes: 0
Indeed. My quite possibly mistaken assumption was that if an SQL database was being used as a backend, you'd want to know how to use SQL for complex queries. Re-reading it, Runter is quite right that SQL wasn't actually mentioned, so sorry for pushing out data that probably wasn't appropriate.

Given the choice, I think using an object database would be faster and would probably integrate with your mapping API more easily, simply because you DO have to play tricks with SQL to map objects onto relations. If you don't want or need the SQL query language, it's usually better to avoid it.

So, I also apologize for my snippiness. I think I'm just used to seeing extra drama in the posting around here or something. *grin*
03 Feb, 2011, Runter wrote in the 17th comment:
Votes: 0
Well, I do apologize if I came off as snotty either way. Sometimes I think text reads harsher than you intend. :p