Closed
Bug 539544
Opened 15 years ago
Closed 14 years ago
Need single node VM for Hadoop+HBase
Categories
(mozilla.org Graveyard :: Server Operations, task)
Tracking
(Not tracked)
RESOLVED
FIXED
People
(Reporter: dre, Assigned: phong)
Details
We need something the Socorro team can run and connect to via Thrift to be able to test their integration work. Eventually, we need to consider whether this could be expanded to a very small cluster of nodes to use as a long-running staging environment.
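The integration testing described here boils down to reaching the VM's Thrift endpoint. A minimal sketch of a pre-flight reachability probe, assuming HBase's Thrift server on its conventional default port 9090 (the port is not specified in this ticket):

```python
import socket

def thrift_port_open(host: str, port: int = 9090, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout.

    This only checks that something is listening on the assumed Thrift
    port; it does not speak the Thrift protocol itself.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers refused connections, timeouts, and DNS failures.
        return False
```

The Socorro team could run something like `thrift_port_open("cm-hadoop-dev02.mozilla.org")` before kicking off their integration suite, failing fast with a clear message instead of a mid-test stack trace.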
Reporter
Updated • 15 years ago
Severity: normal → major
Reporter
Comment 1 • 15 years ago
Ken, can we get something like this set up early next week just for dev testing?
Comment 2 • 15 years ago
I'm happy to get some work done next week on a VM for Socorro testing. Let's talk about the details on IM/IRC Monday and I'll update this ticket with the plan.
Reporter
Comment 3 • 14 years ago
Shifting to server ops so they can construct a clone of cm-hadoop-adm01 to use as the baseline for this VM.
Assignee: kmacinnis → server-ops
Group: mozilla-stats
Component: Data/Backend Reports → Server Operations
Product: Mozilla Stats → mozilla.org
QA Contact: data-reports → mrz
Version: unspecified → other
Reporter
Comment 4 • 14 years ago
IT, is this showing up in your bug triage? I want to make sure it doesn't get lost, since it was reassigned and the product was changed.
Comment 5 • 14 years ago
What specifically do you need here? Just a single VM? I have some older 1U Lenovo hardware I can repurpose here too. If it's just a single VM, what are the storage and memory requirements?
Updated • 14 years ago
Whiteboard: Blocked on comment 5
Comment 6 • 14 years ago
It's just a single VM for now; the goal is functional testing. We'll build out different infrastructure if the needs grow to performance testing. The full prototype HDFS was 50GB. Daniel, do we need all of that data, or a subset? If we do need all of it, the storage requirement would be near 100GB: it'll be single replication, but needs room to grow and to process individual dumps. Memory: 16GB+ if possible, 32GB if you can.
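The sizing above can be sketched as a back-of-the-envelope calculation. The roughly 2x headroom factor is an assumption inferred from the "50GB dataset → near 100GB" figures in this comment; it is not an HDFS rule:

```python
def hdfs_storage_estimate(dataset_gb: float, replication: int = 1,
                          headroom_factor: float = 2.0) -> float:
    """Estimate raw disk needed for an HDFS dataset.

    dataset_gb:      logical size of the data
    replication:     HDFS replication factor (1 on a single-node VM)
    headroom_factor: assumed multiplier for growth and scratch space
                     while processing individual dumps
    """
    return dataset_gb * replication * headroom_factor

# The prototype held 50GB; at single replication with ~2x headroom,
# this lands at the 100GB figure quoted in the comment.
print(hdfs_storage_estimate(50, replication=1))
```

On a production cluster the same data at the usual replication factor of 3 would triple the raw requirement, which is why single replication makes a small VM feasible here.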
Reporter
Comment 7 • 14 years ago
We actually don't need *any* data to be preexisting for this VM. We can set the expectation that the user is only allowed to create at most a hundred crash reports on it. Hopefully, the VM disk needs to be only a GB or two more than the OS+hadoop files.
Comment 8 • 14 years ago
Should be a typical RHEL VM with a 20GB data drive attached.
Assignee: server-ops → phong
Whiteboard: Blocked on comment 5
Comment 9 • 14 years ago
Sounds good. Let's try it. It sounds like it could be a 10GB data drive at most, and even that's likely over-specced.
Assignee
Comment 10 • 14 years ago
What do you want to name this VM? Would cm-hadoop-adm04 work?
Reporter
Comment 11 • 14 years ago
No, I'd prefer cm-hadoop-dev02, please. dev01 is used for a lot of our compilation work, so it isn't suitable for this purpose, but they are both dev-oriented boxes.
Assignee
Comment 12 • 14 years ago
32 or 64 bit?
Assignee
Comment 13 • 14 years ago
cm-hadoop-dev02.mozilla.org 10.2.72.49 is online.
Status: NEW → RESOLVED
Closed: 14 years ago
Resolution: --- → FIXED
Comment 14 • 14 years ago
I don't have access here. Can you ensure tbuckner, daniel, and myself (at least) all have login + sudo access, à la cm-hadoop-adm01?
Status: RESOLVED → REOPENED
Resolution: FIXED → ---
Comment 15 • 14 years ago
openssh restarted; it's working now.
Assignee
Updated • 14 years ago
Status: REOPENED → RESOLVED
Closed: 14 years ago → 14 years ago
Resolution: --- → FIXED
Updated • 9 years ago
Product: mozilla.org → mozilla.org Graveyard