Closed Bug 539544 Opened 15 years ago Closed 14 years ago

Need single node VM for Hadoop+HBase

Categories

(mozilla.org Graveyard :: Server Operations, task)

task
Not set
major

Tracking

(Not tracked)

RESOLVED FIXED

People

(Reporter: dre, Assigned: phong)

Details

We need something the Socorro team can run and connect to via Thrift to be able to test their integration work.
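Before wiring up a full Thrift client, the endpoint can be smoke-tested with a plain TCP check. A minimal sketch, assuming the hostname from this ticket and the default HBase Thrift port 9090 (neither is confirmed in the ticket itself):

```python
import socket

def thrift_port_open(host: str, port: int = 9090, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to the (assumed) Thrift port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (hostname from this ticket; reachability depends on your network):
# thrift_port_open("cm-hadoop-dev02.mozilla.org")
```

This only proves the port accepts connections; actual Socorro integration testing still needs a real Thrift client speaking the HBase protocol.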

Eventually, we need to consider whether this could be expanded to a very small cluster of nodes to use as a long-running staging environment.
Severity: normal → major
Ken,

Can we get something like this set up early next week just for dev testing?
I'm happy to get some work done next week on a VM for Socorro testing.  Let's talk about the details on IM/IRC Monday and I'll update this ticket with the plan.
Shifting to server ops so they can construct a clone of cm-hadoop-adm01 to use as the baseline for this VM.
Assignee: kmacinnis → server-ops
Group: mozilla-stats
Component: Data/Backend Reports → Server Operations
Product: Mozilla Stats → mozilla.org
QA Contact: data-reports → mrz
Version: unspecified → other
IT, is this showing up in your bug triage?  Want to make sure it doesn't get lost since it was reassigned and the project was changed.
What specifically do you need here?  Just a single VM?  I have some older 1u Lenovo hardware I can re-purpose here too.  

If it's just a single VM, storage space & memory requirements?
Whiteboard: Blocked on comment 5
It's just a single VM for now.  The goal is functional testing.  We'll build out a different infrastructure should the needs grow to performance testing.

The full prototype HDFS was 50GB.  Daniel, do we need all of that data, or a subset?  If we do need all of it, the storage requirement would be near 100GB: it'll be single replication, but it needs room to grow and to process individual dumps.  Memory should be 16GB at minimum, 32GB if you can spare it.
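The sizing above works out as data size times replication factor times a headroom multiplier; a back-of-the-envelope sketch (the explicit 2x headroom factor is an assumption that matches the "near 100GB" figure, not something stated in the ticket):

```python
def hdfs_disk_estimate(data_gb: float, replication: int = 1,
                       headroom: float = 2.0) -> float:
    """Estimate raw disk needed: data x replication x growth/scratch headroom."""
    return data_gb * replication * headroom

# Figures from this ticket: 50GB prototype HDFS at single replication,
# roughly doubled for growth and space to process individual dumps.
hdfs_disk_estimate(50)  # ~100GB
```

For comparison, a production cluster at the default replication factor of 3 with no headroom would already need 150GB of raw disk for the same dataset.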
We actually don't need *any* data to be preexisting for this VM.  We can set the expectation that a user creates at most a hundred crash reports on it.  Hopefully, the VM disk needs to be only a GB or two more than the OS plus the Hadoop files.
Should be a typical RHEL VM with a 20GB data drive attached.
Assignee: server-ops → phong
Whiteboard: Blocked on comment 5
Sounds good.  Let's try it; it sounds like a 10GB data drive would be the most we'd need, and even that is likely over-specced.
What do you want to name this VM?  Would cm-hadoop-adm04 work?
No, I'd prefer cm-hadoop-dev02, please.  dev01 is used for a lot of our compilation work, so it isn't suitable for this purpose, but they are both dev-oriented boxes.
32 or 64 bit?
cm-hadoop-dev02.mozilla.org 10.2.72.49 is online.
Status: NEW → RESOLVED
Closed: 14 years ago
Resolution: --- → FIXED
I don't have access here.  Can you ensure tbuckner, daniel, and myself (at least) all have login + sudo access a la cm-hadoop-adm01?
Status: RESOLVED → REOPENED
Resolution: FIXED → ---
openssh restarted, it's working now.
Status: REOPENED → RESOLVED
Closed: 14 years ago
Resolution: --- → FIXED
Product: mozilla.org → mozilla.org Graveyard